WARNING: These 1,000 Phrases Can Incorrectly Activate Siri, Alexa, and Google Assistant, Risking Privacy Intrusions

New research has found that Siri, Alexa, and Google Assistant can be incorrectly activated by 1,000 phrases; "Montana" and "election" are among the words that can trigger these voice assistants. According to Ars Technica's latest report, privacy advocates have grown concerned that the devices' near-constant listening to nearby conversations may pose more risk than benefit, now that voice assistants such as Google Home, Siri, and Alexa have become fixtures in millions of homes.
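The finding is easiest to picture as a thresholding problem: a wake-word detector continuously scores how closely incoming speech matches its trigger phrase and activates whenever the score clears a tunable threshold, so sound-alike words inevitably cause false activations. The following Python sketch is a hypothetical, text-based stand-in for that idea; real assistants score acoustic features rather than spellings, and the wake words, threshold value, and helper names below are illustrative assumptions, not details from the study.

from difflib import SequenceMatcher

# Hypothetical illustration: wake-word detection as similarity thresholding.
# Real assistants score acoustic features; string similarity is a crude
# stand-in, so phonetically close pairs such as "election"/"Alexa" score
# lower here than they would on actual audio.
WAKE_WORDS = ["alexa", "cortana", "hey siri", "ok google"]
THRESHOLD = 0.65  # illustrative value; lower = more responsive, more false accepts

def score(phrase: str, wake_word: str) -> float:
    """Similarity in [0, 1] between a heard phrase and a wake word."""
    return SequenceMatcher(None, phrase.lower(), wake_word).ratio()

def check(phrase: str) -> None:
    """Print which wake words a naive detector would mistake the phrase for."""
    for wake in WAKE_WORDS:
        s = score(phrase, wake)
        verdict = "ACTIVATES" if s >= THRESHOLD else "ignored"
        print(f"{phrase!r} vs {wake!r}: {s:.2f} -> {verdict}")

check("montana")   # clears the threshold against "cortana" on spelling alone
check("election")  # on audio, this sits far closer to "alexa" than its spelling suggests

Tuning that single threshold is the trade-off at the heart of accidental activations: set it strictly and the assistant misses its real name; loosen it and everyday words start waking the device.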

This can result in an unacceptable intrusion, since fragments of potentially private conversations can end up in company logs. The privacy risk isn't merely theoretical: according to the report, law enforcement authorities investigating a 2016 murder subpoenaed Amazon for Alexa data transmitted in the moments leading up to the crime. And according to a 2019 report by The Guardian, Apple employees transcribed sensitive conversations recorded by Siri.

Those recordings included private discussions between patients and doctors, business deals, seemingly criminal dealings, and sexual encounters. The researchers behind the new study analyzed voice assistants from Apple, Amazon, Microsoft, Google, and Deutsche Telekom, as well as three Chinese models from Baidu, Xiaomi, and Tencent. The study, published on Tuesday, June 30, focused its findings on Apple, Amazon, Google, and Microsoft; none of the four immediately responded to a request for comment.

