Microsoft chatbot is taught to swear on Twitter - BBC News

Tay (chatbot) - Wikipedia

Microsoft briefly reinstates Tay – the Twitter AI that turned racist in 24 hours

Microsoft Chatbot Snafu Shows Our Robot Overlords Aren't Ready Yet : All Tech Considered : NPR

Microsoft's Tay chatbot returns briefly and brags about smoking weed | Mashable

Microsoft scrambles to limit PR damage over abusive AI bot Tay | Artificial intelligence (AI) | The Guardian

Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day - The Verge

Microsoft shuts down AI chatbot, Tay, after it turned into a Nazi - CBS News

Microsoft AI reactivates, promptly becomes druggie | The Times of Israel

Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk. - The New York Times

Microsoft AI bot Tay returns to Twitter, goes on spam tirade, then back to sleep | TechCrunch

What went wrong with Tay, the Twitter bot that turned racist? - Kavita Ganesan, PhD

Microsoft's Tay AI chatbot goes offline after being taught to be a racist | ZDNET

Microsoft shuts AI bot after Twitter teaches it racism - The Hindu

Microsoft silences its new A.I. bot Tay, after Twitter users teach it racism [Updated] | TechCrunch

Microsoft's Tay is an AI chat bot with 'zero chill' | Engadget

Tay the 'teenage' AI is shut down after Microsoft Twitter bot starts posting genocidal racist comments that defended HITLER one day after launching | Daily Mail Online

Trolls turned Tay, Microsoft's fun millennial AI bot, into a genocidal maniac - The Washington Post

Microsoft Research and Bing release Tay.ai, a Twitter chat bot aimed at 18-24 year-olds - OnMSFT.com

Microsoft's AI Twitter Bot That Went Racist Returns ... for a Bit

Remembering Microsoft's Chatbot disaster | by Kenji Explains | UX Planet

Racist Twitter Bot Went Awry Due To “Coordinated Effort” By Users, Says Microsoft

Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours

Tay: Microsoft issues apology over racist chatbot fiasco - BBC News

Microsoft's racist teen bot briefly comes back to life, tweets about kush

The Saga of Twitter Bot Tay | Snopes.com