Tay was an artificial intelligence chatterbot originally released by Microsoft Corporation via Twitter on March 23, 2016. It caused controversy when the bot began to post inflammatory and offensive tweets through its Twitter account, forcing Microsoft to shut down the service only 16 hours after its launch. Ars Technica reported that Tay exhibited topic "blacklisting": interactions with Tay regarding "certain hot topics such as Eric Garner (killed by New York police in 2014) generate safe, canned answers".
Before we get into the examples, though, let’s take a quick look at what chatbots really are and how they actually work.
Microsoft said it was "deeply sorry for the unintended offensive and hurtful tweets from Tay", and that it would "look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values".
However, when Tay briefly reappeared, it soon became stuck in a repetitive loop, tweeting "You are too fast, please take a rest" several times a second.
If you’ve ever used a customer support livechat service, you’ve probably experienced that vague, sneaking suspicion that the “person” you’re chatting with might actually be a robot.
Like the endearingly stiff robots we’ve seen in countless movies – tragic, pitiful machines tortured by their painfully restricted emotional range, futilely hoping to attain a greater degree of humanity – chatbots often sound human, but not quite.

It’s the online equivalent of the “Uncanny Valley,” a mysterious region nestled somewhere between the natural and the synthetic that offers a disturbing glimpse at how humans are making machines that could eventually supplant humans, if only their designers could somehow make their robotic creations less nightmarish.

Chatbots have become extraordinarily popular in recent years, largely due to dramatic advancements in machine learning and other underlying technologies such as natural language processing.