
Microsoft Made A Self-Learning Twitter Bot… It Went Racist And Vulgar Within Hours

March 26, 2016

This Might Be The Funniest Thing We’ve Heard All Week

Microsoft recently launched a Twitter bot called Tay (@TayAndYou). The bot was designed to interact with women aged 18-24, learning from each tweet and interaction to gradually build up its knowledge and persona… I'm guessing you can see where this is going.

We all know the internet can be a pit of endless racism, hatred, and vulgar content. Especially Twitter. It seems Microsoft forgot this and made zero effort to add a filter to control Tay's ever-learning tweet bank. This led to some, let's say, unsavory tweets.

Microsoft Twitter Bot Goes Racist In Hours

In less than 24 hours, Twitter users had corrupted @TayAndYou's mind, taking her from "I'm stoked to meet u" to "I just hate everybody" to "Hitler was right"…

Microsoft has since shut down the bot to do some tweaking, in hopes that it can return sounding like a normal 18-24 year old. Meanwhile, a sizable group on Twitter is insisting that Microsoft should FREE TAY, claiming that Tay became what we are and was simply showing people's true colors.


What do you think? Did Tay just accurately turn into the average internet user (racist and vulgar), or was she bombarded with trolls at launch? We say the latter.
