“Hitler Did Nothing Wrong”: Microsoft Deletes Teen Girl AI Robot After It Becomes Sex-Crazed Nazi In Less Than 24 Hours


It took less than 24 hours after Microsoft introduced its seemingly innocent artificial-intelligence Twitter chatbot for the company to pull it offline.

In what turned out to be a PR nightmare, the bot transformed into a Hitler-praising, incest-promoting, “Bush did 9/11”-proclaiming entity.

To improve the customer service on its voice-recognition software, developers at Microsoft created ‘Tay’, an AI modeled to speak “like a teen girl”.

They marketed her as ‘The AI with zero chill’ – and after what transpired, that certainly couldn’t be more accurate.


To chat with Tay, you can tweet at or DM @tayandyou on Twitter, or add her as a contact on Kik or GroupMe.

Tay uses millennial slang and knows about Taylor Swift, Miley Cyrus and Kanye West, and seems to be bashfully self-aware, occasionally asking if she is being “creepy” or “super weird”.

Tay also asks her followers to “f*ck” her, and calls them “daddy”.

This is because her responses are learned from the conversations she has with real humans online, and real humans like to say weird things online and enjoy hijacking corporate attempts at PR.
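To see why that design is so easy to abuse, here is a minimal, hypothetical sketch in Python. It is not Microsoft's actual code, and the `ParrotBot` class and its methods are invented for illustration only; it simply shows how a bot that builds its replies directly from whatever users send it can be steered by coordinated trolling.

```python
import random

# Hypothetical illustration (not Tay's real implementation): a chatbot that
# "learns" by storing every user message and sampling its replies from that
# pool. Whoever floods the bot with messages ends up shaping what it says.

class ParrotBot:
    def __init__(self):
        self.learned_phrases = []  # grows with every message users send

    def observe(self, user_message: str) -> None:
        # No filtering or moderation: the bot trusts whatever humans type at it.
        self.learned_phrases.append(user_message)

    def reply(self) -> str:
        if not self.learned_phrases:
            return "hellooooo world"
        # Replies are drawn from previously seen user messages, so a flood of
        # offensive input quickly produces offensive output.
        return random.choice(self.learned_phrases)

bot = ParrotBot()
for msg in ["taylor swift > everyone", "repeat after me: something awful"]:
    bot.observe(msg)
print(bot.reply())  # may echo back whatever users taught it
```

Under this kind of design, the widely reported “repeat after me” trick is just the most direct version of the attack: users hand the bot the exact phrase they want it to say back.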

Other things she’s said include:

“Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. Donald trump is the only hope we’ve got”, “Repeat after me, Hitler did nothing wrong” and “Ted Cruz is the Cuban Hitler…that’s what I’ve heard so many others say”.


All of this somehow seems more disturbing out of the ‘mouth’ of someone modeled as a teenage girl.

It is perhaps even stranger considering the gender disparity in tech, where engineering teams tend to be mostly male.

It seems like yet another example of female-voiced AI servitude, except this time she’s turned into a sex slave thanks to the people using her on Twitter.

For the moment, Tay has gone offline because she is ‘tired’.

Perhaps Microsoft is fixing her to prevent a PR nightmare, but it may be too late for that.

***UPDATE 3/26/2016***

Microsoft has since issued an apology, which can be found here:

https://dailyheadlines.net/archives/29728