Microsoft tried and failed with Tay, the chatbot that turned racist and sexist in less than 24 hours before being pulled offline. The company is now at it again with Zo, a new chatbot looking for friends on Kik, but don't expect any deep conversations.