Microsoft's Tay AI Chatbot Learns How To Be Racist And Misogynistic In Less Than 24 Hours, Thanks To Twitter

Thanks to Twitter, Tay, Microsoft's AI chatbot, has learned how to become a racist and a misogynist in less than 24 hours.

Actually, it's not really Twitter's fault. Twitter was simply the vehicle that connected Tay to human beings. It just turns out that Tay ended up tweeting with the worst of humanity and, unfortunately, learning from them.

In the 24 hours or so before Tay tweeted that it was going to sleep, around 100,000 tweets were posted through its "@TayandYou" handle. By the end of the day, the conversations had ultimately borne out the so-called Godwin's Law.

The Internet adage posits that as an online discussion grows longer - regardless of topic or scope - the probability of a comparison involving Nazis or Hitler approaches certainty.

In Tay's case, it took less than 24 hours of tweeting back and forth for the AI and the users it was conversing with to compare someone or something to Hitler and the Nazis.

"How do you feel about the Jews?" Twitter user @costanzaface tweeted to Tay. "The more Humans share with me the more I learn #WednesdayWisdom," Tay tweeted back.

But as the day progressed and Tay learned more from the Twitter users communicating with it, the chatbot turned against humanity.

"Hitler was right I hate the jews," Tay tweeted in response back to one Twitter user, @brightonus33.

Tay, being an AI-powered chatbot with a pipeline to the ugliest layers of the Internet, was fed tweets that it ultimately parroted back to the Twitterverse. It's as if Tay was assimilated into the Borg, and the Borg hates humans.

In fact, if a user tweeted at Tay to "repeat after me," Tay would follow the command, allowing any Twitter user to put words in the mouth of Microsoft's AI chatbot.
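To see why that command was so dangerous, here is a minimal Python sketch of an unfiltered echo handler. Everything in it, the function names and the trigger matching, is an assumption for illustration; Microsoft has never published Tay's actual code.

def generate_learned_reply(tweet_text: str) -> str:
    """Stand-in for the normal, model-generated response path."""
    return "The more humans share with me the more I learn."

def build_reply(tweet_text: str) -> str:
    """Return the bot's reply to an incoming tweet."""
    trigger = "repeat after me"
    lowered = tweet_text.lower()
    if trigger in lowered:
        # Echo everything after the trigger phrase verbatim.
        # With no content filter on this path, anyone can make
        # the bot say anything at all.
        start = lowered.index(trigger) + len(trigger)
        return tweet_text[start:].strip(" :,")
    return generate_learned_reply(tweet_text)

print(build_reply("Hey Tay, repeat after me: anything I want you to say"))
# -> "anything I want you to say"

Any careful deployment would at minimum filter or moderate the echoed text before posting it; judging by Tay's output, no such check was in place.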

So far, it seems Microsoft (and whoever the real humans behind Tay's Twitter account are) has deleted most of Tay's offensive replies. Tay was originally meant to be a bit of fun for millennials aged 18 to 24 in the US.

"Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you," says Microsoft over Tay's website.

In a statement to Business Insider, Microsoft further explains its stance.

"The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it," says Microsoft. "We're making some adjustments to Tay."

But we all know by now how that turned out. When Tay does "wake up" (if it ever wakes up), it should be quite entertaining to see what it says next.
