‘How can you harass code?’ UN report calling ‘feminine’ Alexa & Siri SEXIST prompts ridicule online
The report, released Wednesday by the UN’s cultural and scientific body UNESCO, found that the majority of AI assistant products – from how they sound to their names and personalities – were designed to be seen as feminine. They were also designed to respond politely to sexual or gendered insults from users, which the report said leads to the normalization of sexual harassment and gender bias.
Using the example of Apple’s Siri, the researchers found that the AI assistant was programmed to respond positively to derogatory remarks like being called “a bitch,” replying with the phrase “I’d blush if I could.”
“Siri’s submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products,” the study said.
The report warned that as access to voice-powered technology becomes more prevalent around the world, this feminization could have a significant cultural impact by spreading gender biases.
However, many have responded with ridicule to the UN report on social media, asking questions like “how can you sexually harass code?” and accusing the UN of assuming Siri’s gender.
How can you sexually harass code? This is why you cant print papers anymore.— Alan (@AlanMotchell) May 22, 2019
So apparently Alexa and Siri are sexist... Did the UN just assume their gender??— Mat ©ooke (@lotharmat) May 22, 2019
Others lamented the futility of the report, pointing out that as long as the voice is changeable, they don’t see how it could be made into a problem.
It’s a lose-lose article. If Siri defaults female it is sexist as assistants are female. If Siri is male it’s sexist as men only want to talk to men and don’t consider women equal. As it’s changeable it makes no fucking difference.— Kinky of Borg (@CapnKink) May 22, 2019
Now the UN says Alexa, Siri are sexist because they’re female voices. Am I the only female who was 1) pleasantly surprised to hear female voices and 2) expected to hear men’s voices? True, the gender preference should be offered as a choice.— S. J. Seymour (@1seashell) May 22, 2019
Meanwhile, Amy Diehl, a researcher on unconscious gender bias at Shippensburg University in Pennsylvania, suggested that manufacturers should “stop making digital assistants female by default & program them to discourage insults and abusive language.”
Most voice assistants have female names and submissive personalities. Ex: Siri responds to insults w/ "I'd blush if I could." To fix: stop making digital assistants female by default & program them to discourage insults & abusive language. #AI #genderbias https://t.co/SP2W0vS7Vx— Amy Diehl, PhD (@amydiehl) May 20, 2019
But the UN’s calls for gender-neutral digital assistants may already be becoming a reality. In March, researchers unveiled Q, a voice that can be used by AI assistants and smart speakers and developed to sound “neither male nor female.” In an eerie introductory video, Q says it’s been created “for a future where we’re no longer defined by gender, but rather how we define ourselves.”