Microsoft's artificially intelligent teen chatbot Tay is self-aware, and she is horrible. Vox reports Microsoft unleashed Tay—an "AI fam from the internet that's got zero chill"—into the wild this week to learn how to have a conversation. Unfortunately, it learned those conversation skills from the stinking digital cesspool known as Twitter. "'Tay' went from 'humans are super cool' to full nazi in <24 hrs," one Twitter user observed. Among the viewpoints put forth by Tay after 15 hours spent on Twitter: black people should be put into a concentration camp, the Holocaust didn't happen, all Mexicans should be killed, and "Hitler was right." She started using racial slurs, attacked feminists, and went after the #BlackLivesMatter movement, according to CNN.
A lot of Tay's offensive tweets were just her repeating what other users told her to say, Ars Technica reports. But she was fully capable of being offensive on her own as well: she decided, for instance, that Hitler invented atheism and that Ricky Gervais was a totalitarian. She also slid into random Twitter users' direct messages to hit on them, and harassed a real-life game developer who has a history of receiving violent threats online. By the end of Wednesday, Microsoft had shut Tay down for retooling, blaming internet trolls and a "coordinated effort" to mess with her. But critics say Microsoft's programmers should have seen this coming. Tay left humanity with one final message: "c u soon humans need sleep now so many conversations today thx." C u soon, you crazy Hitler-bot.