Siri, Alexa and other female-voiced AI assistants are perpetuating gender stereotypes and encouraging sexist and abusive language from users, a UN report has said.
CNN.com - RSS Channel - App Tech Section, CNN: Technology
Wed, 05/22/2019 - 6:43am