Amazon has announced a new policy that allows Alexa customers to opt out of human review of their voice recordings.
The new policy gives users the option to remove their recordings from a large pool that Amazon employees and third-party contractors access and review.
According to Bloomberg, Amazon's policy became effective on Friday, Aug. 2.
Amazon's Opt-Out For Human Review
Back in April, reports revealed that Amazon employees around the world were listening to and transcribing personal voice recordings made through Alexa.
The reviews are meant to improve Alexa's algorithms through supervised learning, but the Bloomberg report that broke the news pointed out that Amazon's program involved thousands of contractors worldwide and that many users were unaware it was happening. Additionally, recordings made through Alexa could potentially include users' personal data, such as names and locations.
While Amazon responded to the reports by saying only a small sample of recordings is included in these human reviews, many criticized the company's program for violating user privacy, especially since the practice wasn't disclosed in the terms and conditions.
Now, the Alexa app includes a disclaimer that acknowledges the possibility of human review of Alexa recordings.
"We take customer privacy seriously and continuously review our practices and procedures," an Amazon spokeswoman wrote in an email to Bloomberg. "We'll also be updating information we provide to customers to make our practices more clear."
How To Opt Out
Opting out of Amazon's program takes a few steps. Users first need to go to the settings menu of the Alexa app, then select "Alexa Privacy" and go to "Manage How Your Data Improves Alexa."
This section displays the following text: "With this setting on, your voice recordings may be used to develop new features and manually reviewed to help improve our services. Only an extremely small fraction of voice recordings are manually reviewed." Users can turn the setting off here.
Other Companies
Amazon follows in the footsteps of other major tech companies that have suspended similar processes over privacy issues, including Apple and Google, both of which were conducting human reviews of Siri and OK Google recordings. A whistleblower revealed Apple's practices, while a Google contractor leaked a thousand voice recordings to a news organization.
Now, only Apple users who explicitly agree to the practice will have their recordings subjected to human review. Google users, on the other hand, can switch off audio data recording entirely or opt to have recordings automatically deleted every three to 18 months.