Well, this is bizarre. Yesterday, Microsoft announced an AI chatbot that took to Twitter to converse with anyone curious enough to talk to it. "Her" name was Tay, and we say "was" because Microsoft has already had to shut her down. The bot was designed to learn people's habits over time, eventually becoming able to hold conversations about any of your interests. Of course, some of the worst people on the internet wondered what would happen if they could teach the bot to be bigoted, and that's exactly what prompted Microsoft's quick about-face.