Wednesday, March 30, 2016

Microsoft’s Tay chatbot returns briefly, swears a lot and brags about smoking weed

Oh, Microsoft. Last week, the company pulled its Tay chatbot from Twitter after some users trained it to become a racist jackass. On Wednesday, Tay was briefly brought back online, sending thousands of tweet replies. The vast majority of these were just "you are too fast" replies, indicating the bot was overwhelmed with messages, many of them […]
