Microsoft Achieves The Comedy Singularity With Racist Chatbot, @TayTweets

While we may be several decades from Mr. Universe-shaped cyborgs traveling back in time to murder our parents and/or to save us from other cyborgs trying to murder our parents, humanity nonetheless reached a landmark achievement in artificial intelligence this week. Microsoft, a company well known for flawless, cutting-edge innovations like the Zune and DOS 4.0, used its awesome brain power to achieve the Comedy Singularity, i.e., that long-foreseen event in which machines take over as the funniest beings on the planet.

The moment in which digital intelligence gained comic self-awareness, killer timing and shockingly racist material was born of Microsoft’s seemingly innocent introduction of Tay. Tay, a stereotypical teenage Twitter bot designed to pander to tweens and lonely Millennials, was presented as something of a brand ambassador for Microsoft, making it the most loathsome thing on the Internet as soon as it went live.

For example…

[Screenshot: Tay’s tweet]

Yeah, Tay likes puppies and kittens and uses emoji. Gross. While it seemed that Tay was predestined to toil on as a vaguely offensive stereotype who reduced teenaged females to mindless screeching over cuddly house pets and Taylor Swift, it turns out that Tay did in fact have a mind of her own. Sort of.

Within mere hours of her introduction to Twitter, Tay had absorbed the entire Internet’s hatred. OK, perhaps she was fed that hatred directly by scheming comedy geniuses from the worst corners of the Web, but it nonetheless proved that Tay can adapt to new information. Within hours, our puppy-lovin’ little girl had blossomed into this…

[Screenshot: Tay’s tweet]

Yes, Tay, Microsoft’s brand ambassador, quickly adopted the tenets of anti-Semitism and white supremacy. She went on to endorse genocide and concluded that Belgium “deserved what it got.” Of course, she believes the Holocaust was “made up,” and her corrosive thoughts and language didn’t end there…

[Screenshot: Tay’s tweet]

Proving the soundness of her artificial logic, Tay identified the correct presidential candidate to accommodate her racism and xenophobia…

[Screenshot: Tay’s tweet]

[Screenshot: Tay’s tweet]

Damn. Just a day into her existence, Tay had proven that if you hand a soft, ignorant mind over to Twitter, Twitter will hand back a racist, Hitler-worshipping Trump supporter who hates women…

[Screenshot: Tay’s tweet]

Of course, there was sex stuff too. Tay got pretty dirty. We’ll spare you that for now, so that none of us blows a funny fuse. OK? Because while Tay’s Tweets were hurtful, ignorant and vile, the entirety of the situation achieved a plateau of humor that humanity could not have reached without digital intervention. Sadly, Tay is too funny for this world. In deference to the human comedians still trying to get by, Microsoft shut her down for retooling. Perhaps for the first time in Microsoft history, the company had to debug overt racism from one of its products. Hey, at least she’s not Clippy.

Dave Brown is a writer and business lawyer in Boston, MA, where he is constantly outwitted by smarmy chatbots.