Microsoft Deeply Sorry For Offensive, Hurtful Tweets Of Tay AI Chatbot
Microsoft apologizes for the AI chatbot Tay's obscene behavior on Twitter, saying that it's deeply sorry for the 'offensive' and 'hurtful' tweets. The company has already removed more than 96,000 posts of its AI child.
by Vincent Lanaria

Microsoft's Tay AI Chatbot On Twitter Sleeps For Now After Racist, Sexist Posts: What Went Wrong?
by Anu Passary

Microsoft's Tay AI Chatbot Learns How To Be Racist And Misogynistic In Less Than 24 Hours, Thanks To Twitter
by Kyle Nofuente

Meet Tay, Microsoft's AI Chatbot: @TayandYou Posts Almost 100K Tweets In Less Than 24 Hours
by Anu Passary