Yahoo India Web Search

Search results

  1. Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. Controversy followed when the bot began to post inflammatory and offensive tweets through its Twitter account, prompting Microsoft to shut down the service only 16 hours after its launch. [1]

  2. Mar 24, 2016 · Less than a day after she joined Twitter, Microsoft's AI bot, Tay.ai, was taken down for becoming a sexist, racist monster. AI experts explain why it went terribly wrong.

  3. Mar 24, 2016 · It took less than 24 hours for Twitter to corrupt an innocent AI chatbot. Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in "conversational...

  4. Mar 25, 2016 · Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a Holocaust-denying racist, but in doing so it made clear that Tay's views were a result of...

  5. Nov 25, 2019 · UPDATE 4 JANUARY 2024: In 2016, Microsoft’s chatbot Tay—designed to pick up its lexicon and syntax from interactions with real people posting comments on Twitter—was barraged with antisocial ideas and vulgar language. Within a few hours of it landing in bad company, it began parroting the worst of what one might encounter on social media.

  6. Mar 24, 2016 · Today, Microsoft had to shut Tay down because the bot started spewing a series of lewd and racist tweets. Tay was set up with a young, female persona that Microsoft's AI programmers...

  7. May 10, 2023 · Microsoft's Tay AI went from a promising AI to a total catastrophe in less than a day. Here's what the company learned.

  8. Jul 24, 2019 · In March 2016, Microsoft sent its artificial intelligence (AI) bot Tay out into the wild to see how it interacted with humans. According to Microsoft Cybersecurity Field CTO Diana Kelley, the...

  9. Mar 25, 2016 · Tay – a chatbot created for 18- to 24-year-olds in the U.S. for entertainment purposes – is our first attempt to answer this question. As we developed Tay, we planned and implemented a lot of filtering and conducted extensive user studies with diverse user groups.

  10. Mar 23, 2016 · But things were going to get much worse for Microsoft when a chatbot called Tay started tweeting offensive comments seemingly supporting Nazi, anti-feminist and racist views. The idea was that...