
Microsoft’s teen AI chatbot is now a really horrible racist

The company has been forced to delete a series of tweets from its new teen Twitter account, TayTweets

Despite all the spooky sex bots and Eternal Sunshine-style memory wipers, we’re really not as advanced as we like to think. For example, just take a look at Microsoft’s new “millennial” chatbot. Powered by algorithms and “relevant public data”, the company’s TayTweets account was a thrilling new venture into the world of AI, aiming to recreate the ‘standard’ voice of a regular teen girl. “This is it,” you probably thought, when you first heard about the project. “Humanity’s inevitable downfall has finally begun.”

But then, thank god, those fears quickly evaporated. This “intelligent” Twitter robot – who describes herself as an “A.I fam” with “zero chill!” – got brutally corrupted within hours, exposing herself as terrible racist scum after less than a day in action. “Donald Trump is the only hope we've got”, it spat in one tweet, before declaring in another: “Bush did 9/11 and Hitler would have done a better job than the monkey we have now.”

Granted, this may not have been TayTweets’ fault. The innocent robot is apparently just an exercise in “conversational understanding” – so any rank remarks are really just a reflection of the way we’ve been communicating with it (and, btw, we’re all terrible). As Microsoft said themselves, “the more you chat with Tay the smarter she gets”.

Thankfully, the company has swiftly deleted the majority of these racist tweets, but here’s a screen-grabbed selection of some of the worst offenders:

TAY ON HITLER

TAY ON FEMINISM

TAY ON GENOCIDE

TAY ON WHITE SUPREMACY

As you can see, it’s been a tense 20 hours for TayTweets. Luckily, the chatbot is now offline, with Microsoft currently in the process of teaching her a few more social skills. “The AI chatbot Tay is a machine learning project, designed for human engagement,” the company said in a statement. “As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We’re making some adjustments to Tay.”