Alexa, Siri, And Google Assistant Follow Malicious Voice Commands Hidden In Music

Researchers from the United States and China have demonstrated the possibility of hiding malicious voice commands in music that Amazon's Alexa, Apple's Siri, and Google Assistant will follow.

Digital assistant security problems are not new. However, this latest one is different because it could victimize even the most careful users, as the subliminal voice commands are undetectable to the human ear.

Malicious Voice Commands For Digital Assistants Hidden In Music

Over the past two years, researchers working in university labs have developed hidden commands that Siri, Alexa, and Google Assistant will pick up but humans will not.

In 2016, students from Georgetown University and UC Berkeley showed that it was possible to hide commands within white noise coming from loudspeakers or YouTube videos to make smart devices perform functions like opening websites and activating airplane mode. In the latest development of the research, some of the UC Berkeley students determined a way to hide commands within music or spoken text recordings.

"We wanted to see if we could make it even more stealthy," said UC Berkeley fifth-year computer security Ph.D. student Nicholas Carlini, one of the authors of the research that has been published online. The group provided samples of songs where voice commands have been embedded to make digital assistants do specific things, including visiting websites, turning on GPS, and making phone calls.

Are Hackers Already Using The Method?

According to Carlini, there is no evidence that attackers are using this method in the real world. However, he added that it may only be a matter of time before they do, and he assumes that malicious parties are already trying to replicate his team's work.

While the undetectable voice commands demonstrated by the researchers are harmless, it is easy to see how attackers could exploit the technique. They could have digital assistants unlock the doors of smart homes, transfer money through banking apps, and purchase items from online retailers, all without the user knowing what is happening.

Users who want to protect themselves against such attacks already have some options. Owners of devices with Alexa and Google Assistant can enable the optional feature that locks access to personal information to a specific user based on voice patterns. Meanwhile, iPhone and iPad owners are protected by default, as Siri requires the iOS device to be unlocked before it grants access to sensitive information.

One way for Amazon, Apple, and Google to protect their customers would be to further develop voice authentication so that devices only act on commands from the authorized user.
