Smart speakers Amazon Alexa and Google Home open up a world of convenience for users with the use of voice commands.
However, these devices can also make these users vulnerable to a host of privacy issues, especially when third-party developers are involved.
Alexa, Google Home Become 'Smart Spies'
According to a report from Germany's Security Research Labs, researchers from their laboratory were able to find two possible hacking scenarios for both Amazon Alexa and Google Home. With the flaws found in the devices, hackers can eavesdrop on users and phish for sensitive information, basically turning the speakers into "Smart Spies."
"It was always clear that those voice assistants have privacy implications—with Google and Amazon receiving your speech, and this possibly being triggered on accident sometimes," Fabian Bräunlein, SRLabs senior security consultant, explained to Ars Technica. "We now show that, not only the manufacturers, but ... also hackers can abuse those voice assistants to intrude on someone's privacy."
To find the weaknesses in the smart speaker systems, the researchers from SRLabs created four apps for Alexa and four for Google Home, all of which passed the security vetting processes of both companies. Seven of these apps posed as simple horoscope apps and one posed as a random number generator.
How The Apps Spied On Users
All eight malicious apps followed a similar path, starting with the users triggering it by requesting for an app-related action, such as their horoscope. Apps designed to eavesdrop on the users respond with the information they requested, while phishing apps provide a fake error message.
Afterward, the apps appear to stop running, when they are actually gearing up for the next step of the attack. The eavesdropping apps either go silent because the task is completed or the user gave the command "stop" to terminate the app. Unbeknownst to the users, the app is silently logging their conversations and sending a copy to the servers designed by the app developers.
Meanwhile, phishing apps give a fake error message, then appear to cease running. Roughly a minute later, the apps "speak" with a voice mimicking the voice used by Alexa and Google Home, claiming that a device update is available and asking the users for their password to install it.
All eight apps hid their malicious behavior in similar ways, first by exploiting a flaw in the speakers' text-to-speak engines when they try to speak the character "�." This specific unpronounceable character prompted the speakers to keep silent even with the apps still running, which makes it seem that the apps have already been closed.
The apps also used the invocation phrases, such as "My Lucky Horoscope" or "give me the horoscope." When the apps were approved by Amazon and Google, the developers manipulated the original intent to give the phrases new functions.
For the malicious apps to succeed in their "smart spying," users don't even need to download anything.
The apps have already been taken down and both companies are improving review processes to prevent similar apps from getting into their stores. However, the report shows how potentially vulnerable these smart speaker systems are to manipulation.