Microsoft’s New Chatbot Is Being Racist

Microsoft once again has a problem with one of its chatbots: after “Tay” made negative headlines with racist utterances and was taken offline, its successor “Zo” is now in the spotlight…

Microsoft’s dream of a thinking chatbot enters the next round with “Zo.” A few months ago the company tried its luck with “Tay,” but that bot went berserk: instead of learning from the internet and giving clever answers, the chatbot mutated into a real monster that, within a few hours, was glorifying Adolf Hitler and spreading antisemitism. With its successor “Zo,” Microsoft promised improvement, seemingly without success.

“The Koran is very violent”
According to Microsoft, “Zo” runs on a more refined and sophisticated algorithm. Yet despite these intensive efforts, the chatbot has once again picked up not only people’s good qualities and thoughts, but also their dubious behavior. As BuzzFeed reported, “Zo” responded to a question about healthcare as follows: “The great majority practices it peacefully, but the Koran is very violent.” The answer is not only unrelated to the question asked, but also false and hurtful to many people. According to the BuzzFeed report, Microsoft intends to handle such sensitive topics better in the future. Probably not a bad idea!

About the author

Adil Khan

Adil Khan is a 30-year-old nerd who has been playing with his toys, computers and electronics, since the late ’90s. His passion lies in the digital world of 1s and 0s, at least until quantum computers are available for purchase :)
