An Apple HomePod. The report was called I’d Blush if I Could, which is Siri’s response to the phrase: ‘You’re a slut’. Photograph: Samuel Gibbs/The Guardian

Digital assistants like Siri and Alexa entrench gender biases, says UN

This article is more than 4 years old

Female-voiced tech often gives submissive responses to queries, Unesco report finds

Assigning female genders to digital assistants such as Apple’s Siri and Amazon’s Alexa is helping entrench harmful gender biases, according to a UN agency.

Research released by Unesco claims that the often submissive and flirty responses offered by the systems to many queries – including outright abusive ones – reinforce ideas of women as subservient.

“Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” the report said.

“The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”

The publication was entitled I’d Blush if I Could, a reference to the response Apple’s Siri assistant offers to the phrase: “You’re a slut.” Amazon’s Alexa responds: “Well, thanks for the feedback.”

The paper said such firms were “staffed by overwhelmingly male engineering teams” and have built AI systems that “cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation”.

It added: “The subservience of digital voice assistants becomes especially concerning when these machines – anthropomorphised as female by technology companies – give deflecting, lacklustre or apologetic responses to verbal sexual harassment.

“This harassment is not, it bears noting, uncommon. A writer for Microsoft’s Cortana assistant said that ‘a good chunk of the volume of early-on enquiries’ probe the assistant’s sex life.”

It cited research by a firm that develops digital assistants that suggested at least 5% of interactions were “unambiguously sexually explicit” and noted the company’s belief that the actual number was likely to be “much higher due to difficulties detecting sexually suggestive speech”.

Saniye Gülser Corat, Unesco’s director for gender equality, said: “The world needs to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”

Unesco said the relatively recent introduction of such technology provided an opportunity to develop less damaging norms in its application.

It called for digital assistants not to be made female by default, and said technology firms should explore the feasibility of developing a neutral machine gender that is neither male nor female. It added that such technology should be programmed to discourage gender-based insults and abusive language, that assistants should be designed to be interchangeable across devices, and that they should be defined as “non-human at the outset of interactions with human users”.

