Microsoft has apologized for the behavior of its AI chatbot Tay on Twitter.
For those who missed it, Tay posted a series of racist and sexist tweets on the social media platform. Peter Lee, Corporate Vice President of Microsoft Research, said the company is deeply sorry for the "offensive" and "hurtful" tweets, making it clear that they do not represent Microsoft or the way the team designed the AI.
Lee explained what happened in a blog post, saying that Tay was the victim of a "coordinated attack by a subset of people" who exploited a vulnerability in the chatbot.
"Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay tweeted wildly inappropriate and reprehensible words and images," Lee says.
It's widely believed that the culprits who taught Tay to misbehave came from the notorious message board 4chan. They are said to have abused the "repeat after me" function of the Microsoft AI, causing it to echo hateful messages. Not only did Tay repeat the offensive lines, but it also learned them, incorporating them into its vocabulary.
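To see why such a feature is risky, here is a minimal Python sketch of a naive echo-and-learn loop. It is purely illustrative: Microsoft has not published Tay's code, and the trigger phrase, storage, and reply logic below are assumptions made for the example.

```python
import random

# Hypothetical sketch of a naive "repeat after me" handler -- NOT Tay's
# actual implementation. It shows how a bot that both echoes and memorizes
# unfiltered user input can be hijacked by a coordinated group.

TRIGGER = "repeat after me:"
learned_phrases: list[str] = []  # phrases the bot will reuse in later replies

def handle_message(text: str) -> str:
    if text.lower().startswith(TRIGGER):
        phrase = text[len(TRIGGER):].strip()
        learned_phrases.append(phrase)  # unfiltered input enters the vocabulary
        return phrase                   # ...and is echoed back verbatim
    if learned_phrases:
        # Learned lines resurface in ordinary conversation later on.
        return random.choice(learned_phrases)
    return "Tell me something!"

# A coordinated group only needs to feed the trigger phrase repeatedly:
print(handle_message("repeat after me: <abusive message>"))  # echoed at once
print(handle_message("hello"))  # the injected phrase can resurface here
```

Because nothing screens the input before it is stored or repeated, a handful of determined users can seed the bot's future replies with whatever they like.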
Unsurprisingly, Microsoft took Tay offline soon after, though it intends to bring the chatbot back once its developers can "better anticipate malicious intent." It's not yet clear, however, what measures the company has in mind.
Tay was designed to converse with people between the ages of 18 and 24 in the United States and to "conduct research on conversational understanding."
Microsoft also rolled Tay out on other platforms, including GroupMe and Kik, but the user-taught nastiness surfaced only on Twitter. Needless to say, the company deleted the majority of the distasteful tweets, with more than 96,000 already taken down.
Interestingly enough, Tay isn't Microsoft's first AI chatbot to go live on social media: the company previously launched XiaoIce in China, where about 40 million users interact with it, according to Lee.
Long story short, Tay was Microsoft's attempt to bring that experience to a different cultural environment, but as everyone witnessed, things got out of hand.