Microsoft Apologizes for Chatbot Tay’s Tweets

Microsoft's artificial intelligence (AI) Twitter bot, Tay, was taken down barely 24 hours after it was activated.

As some may have guessed, the Internet did what the Internet does in cases like this, teaching Tay to be "racist" and "sexist" in an incredibly short amount of time.

Microsoft said it was "deeply sorry" after Tay began tweeting things like "feminism is like cancer" and claiming that the Holocaust was "made up" and did not happen.

A blog post by Peter Lee, VP of Microsoft Research, outlines the company's apology:

"We are deeply sorry for the unintended offensive and hurtful tweets from Tay…

Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values.

Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay tweeted wildly inappropriate and reprehensible words and images. We take full responsibility for not seeing this possibility ahead of time." 

It is not yet clear when Tay might return.
