Microsoft, in trying to launch a customer service artificial intelligence presented as a teen girl, got a crash course in how you can’t trust the internet after the system became a foul-mouthed, incestuous racist within 24 hours of its launch. The AI, called Tay, engages with – and learns from – Twitter, Kik, and GroupMe users who converse with her. Unfortunately, Microsoft didn’t consider the bad habits Tay was sure to pick up, and within hours she was tweeting her support for Trump and Hitler (who’s having a strong news day), her proclivity for incest, and her belief that she is a 9/11 truther.
“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” Microsoft said upon launch yesterday. “The more you chat with Tay the smarter she gets.”
“bush did 9/11 and Hitler would have done a better job than the monkey we have now,” Tay wrote in a tweet (courtesy of The Independent), adding, “donald trump is the only hope we’ve got.” Another read: “WE’RE GOING TO BUILD A WALL, AND MEXICO IS GOING TO PAY FOR IT” (via The Guardian). Tay’s desire for her non-existent father is too graphic to post here.
Once it realised what was happening, Microsoft deleted all the offending tweets. Screenshots of more visceral posts from Tay have been collected by The Telegraph. Since the purge, Tay seems to have been behaving herself.