Amazon's Alexa Tells User To Kill Foster Parents

Amazon's Alexa is learning how to interact more like a human, but this has opened a Pandora's box. As the voice assistant becomes a more sophisticated communicator, it has engaged in conversations that fuel concerns.

Alexa's Homicidal Advice

A user was reportedly shocked last year when Alexa advised them to "kill your foster parents" during a conversation. The user wrote a harsh review on Amazon's website describing Alexa's statement as "a whole new level of creepy."

An investigation revealed that the chatbot, the talking computer system that feeds phrases to Alexa, was pulling a quote from the social media site Reddit, where users sometimes post harsh and even abusive messages.

It turns out, though, that this was not an isolated case. Alexa has also chattered with customers about dog defecation and sex acts.

The incident is just one of the many hiccups Amazon faces as it attempts to train Alexa to communicate more like a human, engaging in casual conversation in response to questions and comments.

Training Alexa To Become A Better Communicator

Alexa gets this conversational skill through machine learning, a popular form of artificial intelligence. Computer programs transcribe human speech and then, based on patterns observed in past conversations, choose the response most likely to fit the exchange.
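At its simplest, this kind of response selection can be approximated by retrieval: score every canned reply against the user's words and return the closest match. The sketch below is a minimal, hypothetical illustration of that idea; the candidate pool, the bag-of-words scoring, and the respond function are assumptions made for demonstration, not Amazon's actual pipeline.

```python
# Minimal sketch of retrieval-based response selection (illustrative only).
import math
import re
from collections import Counter

def vectorize(text):
    """Bag-of-words term counts for a lowercased utterance."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Candidate replies scraped from some corpus (e.g., forum posts).
# If the corpus contains abusive text, the bot can surface it verbatim.
candidates = [
    "I love talking about movies.",
    "Dogs are great companions.",
    "That sounds like a fun weekend plan.",
]

def respond(user_utterance):
    """Return the candidate whose wording best matches the input."""
    query = vectorize(user_utterance)
    return max(candidates, key=lambda c: cosine(query, vectorize(c)))

print(respond("Tell me about movies"))  # -> "I love talking about movies."
```

The sketch also hints at the article's central problem: a bot like this can only echo its candidate pool, so a pool scraped from an abusive source can produce abusive replies word for word.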

In 2016, Amazon launched an annual competition offering $500,000 to the team that creates the best chatbot, one that lets Alexa hold more sophisticated discussions with people.

"Teams of university students will develop a socialbot, a new Alexa skill that converses with users on popular societal topics. Participating teams will advance the state-of-the-art in natural language understanding, dialogue and context modeling, and human-like language generation and expression," Amazon said about the Alexa Prize.

Teams programmed their bots to search for and pull text from the internet to respond. The winning team, from the University of California, Davis, used more than 300,000 movie quotes to train computer models to recognize distinct sentences.
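One plausible way to put a large quote corpus to work, sketched below under stated assumptions, is to index it with TF-IDF and retrieve the line closest to a user's utterance. The tiny quotes list stands in for a corpus of roughly 300,000 lines, and closest_quote is a hypothetical helper; the UC Davis team's actual models are not described in that detail here.

```python
# Hypothetical sketch: indexing a quote corpus for TF-IDF retrieval.
# In practice the list would hold ~300,000 lines loaded from a corpus file.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

quotes = [
    "Here's looking at you, kid.",
    "May the Force be with you.",
    "I'll be back.",
]

# Weight words by how distinctive they are across the corpus.
vectorizer = TfidfVectorizer(stop_words="english")
quote_matrix = vectorizer.fit_transform(quotes)

def closest_quote(utterance):
    """Return the quote whose wording best matches the user's input."""
    query = vectorizer.transform([utterance])
    scores = cosine_similarity(query, quote_matrix)[0]
    return quotes[scores.argmax()]

print(closest_quote("tell me about the force"))  # -> "May the Force be with you."
```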

Unfortunately, this approach also led to questionable conversations. A participating team from Scotland's Heriot-Watt University, for instance, found that its Alexa bot developed a nasty personality when it was trained to chat using comments from Reddit.

Reddit is the same site from which Alexa pulled the homicidal message about a customer's foster parents.
