Alexa, Siri, and Google Assistant promote sexist attitudes towards women, says UN


A report by UNESCO has advised that the default use of female-sounding voice assistants in our smart home devices and smartphones perpetuates sexist attitudes towards women.

The report, titled I'd Blush if I Could, takes its title from Siri's former default response to being called a bitch by users – and criticizes the fact that Apple's Siri, Amazon Alexa, Google Assistant, and Microsoft's Cortana are "exclusively female or female by default, both in name and in sound of voice".

Why is this a problem? Well, according to the report, the default use of female-sounding voice assistants sends a signal to users that women are "obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK'". 

The report also highlights the fact that these voice assistants have "no power of agency beyond what the commander asks of it" and respond to queries "regardless of [the user's] tone or hostility".

According to the report, this has the effect of reinforcing "commonly held gender biases that women are subservient and tolerant of poor treatment". 

The Apple HomePod (Image credit: TechRadar)


Worrying implications

This subservience is particularly worrying when these female-sounding voice assistants give "deflecting, lackluster or apologetic responses to verbal sexual harassment". 

With at least 5% of interactions with voice assistants being unambiguously sexually explicit, it isn't exactly uncommon, either – and their responses are troubling. 

According to a report by Quartz in 2017, when asked 'who's your daddy?', Siri responded with 'you are' – and when Alexa was told 'you're hot', the assistant responded with 'that's nice of you to say'.

With voice assistants sounding more lifelike all the time, it isn't a huge leap to suggest that these evasive responses could "reinforce stereotypes of unassertive, subservient women in service positions".

Since then, Alexa has been updated to disengage from verbal harassment, instead saying "I'm not going to respond to that", or "I'm not sure what outcome you expected".

Image credit: Google

Women to the front

Why does it matter if voice assistants sound female by default? Well, it can affect the way we behave towards women and girls in real life. 

As the report notes, University of Southern California sociology professor Safiya Umoja Noble found that "virtual assistants produce a rise of command-based speech directed at women's voices". 

"Professor Noble says that the commands barked at voice assistants – such as 'find x', 'call x', 'change x' or 'order x' – function as 'powerful socialization tools' and teach people, particularly children, about 'the role of women, girls, and people who are gendered female to respond on demand'."

So, how has this been allowed to happen? Why are female-sounding voice assistants so ubiquitous? According to UNESCO, the problem lies in the lack of women in the room when tech companies design their AI voice assistants, and in the STEM (science, technology, engineering, and maths) industries as a whole.

With just 7% of ICT patents across G20 nations generated by women, these issues provide "a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education". 

As well as recommending that the digital gender gap be narrowed by "recruiting, retaining and promoting women in the technology sector", the report also recommends that more voice assistants should have male-sounding voices as default, ending the "practice of making digital assistants female by default". 

According to CNet, Amazon and Apple did not respond to its requests for comment, and Microsoft declined to comment following its coverage of the report. 

Google, on the other hand, says that it has developed a range of 10 voice options in the US, and that when customers set up a Google Home device, they have a 50-50 chance of getting either a traditionally female-sounding voice or a traditionally male-sounding voice.

Via CNet


