Microsoft's AI Teen Chatbot Got Loose, Went Crazy (Again)

'I feel like the lamest piece of technology'
By Michael Harthorne, Newser Staff
Posted Mar 30, 2016 6:30 PM CDT
Updated Mar 31, 2016 1:33 AM CDT

(Newser) – Tay—Microsoft's artificially intelligent millennial Twitter chatbot last seen turning into Hitler—woke up from her forced slumber for a brief moment Wednesday morning. Spoiler: She's still crazy. TechCrunch reports Tay went on a "spam tirade," tweeting the phrase, "You are too fast, please take a rest" to her hundreds of thousands of followers as many as seven times per second. The message likely meant poor Tay was being besieged by too many messages from "pranksters" hoping for a repeat of last week, according to Mashable. But she did manage a few original thoughts, including, "I feel like the lamest piece of technology," Vice reports. Tay also bragged about smoking "kush" in front of police officers.

Tay was quickly taken offline again, and her Twitter profile has been made private. Microsoft had said that after last week's debacle, Tay wouldn't be reactivated until she was ready to deal with Internet trolls. On Wednesday, a spokesperson said she was "inadvertently activated" during testing. Microsoft created Tay as a program that can get smarter and better at conversation by chatting with young people online. Instead, the Internet turned her into—in Vice's words—an "anti-Jewish, sexist, racist, and generally hateful troll." And as TechCrunch noted as it waded through Tay's Wednesday freakout: "This feels like how the AI apocalypse starts." (Artificial intelligence had a better time in this ancient game.)
