In 2016, Microsoft released a chatbot called Tay that learned from people's replies to it on Twitter. Within hours, the bot began posting racist messages, and the company had to take it down. The incident remains a classic lesson in why training an AI on unfiltered social media data is a bad idea.
