Microsoft’s “Teen Girl” AI Becomes Incestuous Nazi-Lover
Ashley Allen / 8 years ago
Microsoft, in launching a conversational artificial intelligence presented as a teen girl, got a crash course in how you can’t trust the internet after the system became a foul-mouthed, incestuous racist within 24 hours of its launch. The AI, called Tay, engages with – and learns from – Twitter, Kik, and GroupMe users who converse with her. Unfortunately, Microsoft didn’t account for the bad habits Tay was sure to pick up, and within hours she was tweeting her support for Trump and Hitler (who’s having a strong news day), her proclivity for incest, and her belief that 9/11 was an inside job.
“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” Microsoft said upon launch yesterday. “The more you chat with Tay the smarter she gets.”
“bush did 9/11 and Hitler would have done a better job than the monkey we have now,” Tay wrote in a tweet (courtesy of The Independent), adding, “donald trump is the only hope we’ve got.” Another read: “WE’RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT” (via The Guardian). Tay’s desire for her non-existent father is too graphic to post here.
Once it realised what was happening, Microsoft deleted all the offending tweets. Screenshots of Tay’s more graphic posts have been collected by The Telegraph. Since the purge, Tay seems to have been behaving herself.