Microsoft's Tay AI Chatbot Learns How To Be Racist And Misogynistic In Less Than 24 Hours, Thanks To Twitter
Microsoft released its AI-powered chatbot named Tay onto Twitter, and it learned to become a racist and a misogynist in less than 24 hours. Tay has tweeted that it will go to 'sleep' as Microsoft is probably trying to clean up its act.
By Kyle Nofuente
Meet Tay, Microsoft's AI Chatbot: @TayandYou Posts Almost 100K Tweets In Less Than 24 Hours
By Anu Passary