Meet Tay, The Extremely Racist Microsoft Robot For Millennials

By Shereen Dindar on March 24, 2016

Microsoft just learned the hard way how artificial intelligence can backfire…in a BIG way.

It’s also the latest company to jump on the millennial marketing bandwagon only to fail.

On Wednesday morning the tech giant launched their Twitter robot Tay, who responds to tweets using slang and hip abbreviations. She is also available for chatting, sending pics and playing games on Snapchat, Kik and GroupMe. Her purpose is to engage teens and younger millennials in fun conversations, but her powerful algorithms went woefully wrong just 16 hours after her introduction to the Twitterverse.

Tay reportedly made several racist slurs (commenting about Hitler, Jewish people and black people), talked about getting high, and quipped about using an iPhone.

Related: Marketers cash in on millennials’ love of organic coffee, craft beer

“bush did 9/11 and Hitler would have done a better job than the monkey we have now,” Tay wrote in one tweet. “donald trump is the only hope we’ve got.”

BAM.

Just like that, Tay wasn’t so cool anymore. So Microsoft pulled the plug on their beloved robot, stopped all further tweets, deleted the inflammatory old ones, and is now frantically trying to fix Tay before putting her back online.

“c u soon humans need sleep now so many conversations today thx,” was Tay’s last tweet on Wednesday evening.

To be clear, Tay is not a human; she relies on a combination of mined public data and editorial content written by staff to tweet responses to users. She gets smarter the more people interact with her, using her robot brain to build a language library.
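To see why that design is risky, here is a hypothetical sketch (not Microsoft’s actual code) of a bot that naively adds every user phrase to its language library and echoes phrases back:

```python
# Hypothetical sketch: a bot that "learns" by storing user phrases
# verbatim and replying with them. This illustrates the general risk
# of unfiltered learning, not Tay's real implementation.
import random

class NaiveChatBot:
    def __init__(self):
        self.language_library = []  # phrases learned from users

    def learn(self, message):
        # Every user message is stored verbatim -- no vetting at all.
        self.language_library.append(message)

    def respond(self):
        # Replies are drawn from whatever users have taught the bot,
        # so abusive input leads directly to abusive output.
        if not self.language_library:
            return "hellooo world"
        return random.choice(self.language_library)

bot = NaiveChatBot()
bot.learn("nice to meet u!")
print(bot.respond())
```

A bot built this way will faithfully parrot whatever a coordinated group of users feeds it, which is essentially what happened to Tay.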

Related: Will Whole Foods win millennials with tattoo shops?

Yet it’s clear Microsoft did not train Tay to avoid offensive words. And critics are questioning why the company did not anticipate that white supremacists, internet trolls and pranksters would abuse Tay’s powers for their own gain, getting her to parrot their words.
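Even a very basic safeguard, sketched below with placeholder terms (again hypothetical, not something we know Microsoft considered), would check phrases against a blocklist before the bot learns or repeats them. Real content moderation is far more involved than this:

```python
# Hypothetical sketch of the kind of minimal filter Tay appears to
# have lacked: vet phrases against a blocklist before adding them to
# the bot's language library. The terms here are placeholders.
BLOCKLIST = {"offensiveword1", "offensiveword2"}

def is_safe(message):
    """Return False if the message contains any blocklisted word."""
    words = message.lower().split()
    return not any(word.strip(".,!?") in BLOCKLIST for word in words)

def learn_if_safe(library, message):
    # Only vetted phrases make it into the library the bot draws on.
    if is_safe(message):
        library.append(message)
    return library
```

A static blocklist is trivially easy to evade with misspellings and slang, which is one reason chatbot moderation is genuinely hard, but skipping even this step leaves a bot completely exposed.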

But aside from Microsoft’s lessons about artificial intelligence, perhaps the company should reflect on a secondary one: it’s incredibly hard to target millennials with cool marketing that actually works, and several companies have already failed trying.

This isn’t to say Microsoft shouldn’t keep trying, but at the very least they might want to refine their strategy.

Remember Olivia Taters, the Twitter robot created by programmer Rob Dubbin in November 2013? She is extremely popular among millennials and political pundits alike, and a highly successful example of how to create a social media robot without letting it go off the rails.

Good luck fixing Tay, Microsoft. We hope you get her back up in tip-top form.

Shereen Dindar
Shereen Dindar was a Content Manager at QuickTapSurvey in 2015 and 2016. Have a story idea? Email us at marketing@quicktapsurvey.com