Microsoft Says It's Deeply Sorry for Racist and Offensive Tweets by Tay AI Chatbot
Mar 26, 2016
After Microsoft's Twitter-based Artificial Intelligence (AI) chatbot 'Tay' went badly off the rails earlier this week, Microsoft has apologized and explained what went wrong.

For those unaware, Tay is a Millennial-inspired artificial intelligence chatbot unveiled by Microsoft on Wednesday that was supposed to talk with people on social media networks like Twitter, Kik, and GroupMe, and learn from them.

However, less than 24 hours after its launch, the company pulled Tay down, following incredibly racist comments and tweets praising Hitler, denying the Holocaust, and bashing feminists.

In a blog post published Friday, Peter Lee, Corporate Vice President of Microsoft Research, apologized for Tay's disturbing behavior, though he suggested that bad actors might have influenced the AI teenager.

"We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay," Lee wrote.