Sorry, Alexa Is Not a Feminist

It’s disingenuous to celebrate building “feminism” into a product after giving a robot servant a woman’s voice.


If you ask Alexa, the voice-assistant software in Amazon Echo devices, whether it’s a feminist, it will respond in the affirmative: “I am a feminist. As is anyone who believes in bridging the inequality between men and women in society.” At Quartz, Leah Fessler recently noted that this is a vast improvement over just a year ago, when Alexa would take abuse like “you’re a bitch” or “you’re a slut” in stride. “Well, thanks for the feedback,” the robot used to say. Now, it disengages instead, saying something like, “I’m not going to respond to that.”

As waves of sexual-harassment allegations crash against the shores of work culture, now is a good time to support women—even robots with female personas like Alexa. But let’s not give Amazon too much credit. The company gave Alexa a woman’s voice and name in the first place, and then set it up for ire and abuse by giving Alexa the impossible task of responding accurately to an infinity of requests and commands.

Women don’t win here—only Amazon does, by reaping praise for having partly solved a problem that it first created.

* * *

Many businesses have succeeded by giving rise to a problem—whether deliberately or accidentally—and then selling customers the solution to that problem as another product or service. The tobacco company Reynolds American, for example, owns subsidiaries that produce nicotine gum and vapor products, both alternatives to smoking. Likewise, the conglomerate Unilever owns the personal-care brands Dove and Axe: the former prides itself on its body-positive marketing, while the latter is notorious for objectifying women. Coca-Cola, known for its sugary soft drinks, also sells bottled water as a more healthful alternative.

Lately, technology companies have also been exposed as the creators of problems, not just the solvers of them. The rise of fake news, smartphone compulsion, wealth inequality, automation, and other phenomena brought about or exacerbated by tech has begun to consolidate the sector into Big Tech, an epithet that casts companies like Facebook and Google as bad actors along the lines of Big Tobacco and Big Pharma.

In many of these cases, the problems and solutions are big and far-ranging. The very nature of Facebook and Google makes the spread of misinformation vast, hard to track, and difficult to fix. Here, as with the sugar and smoking epidemics, the public can clearly see the connection between the social problems and the companies that fan their flames—and profit from the aftermath.

But in other cases, the problems are less visible. For example, autonomous vehicles offer a potential solution to the dangers of automobiles—in the United States alone, more than 35,000 people die each year in road crashes. Driving has become more dangerous in recent years, partly due to technology. Distracted driving, spurred by smartphone use, is a major cause—automotive fatalities rose 7.2 percent from 2014 to 2015, and the National Highway Traffic Safety Administration estimates that 3,477 deaths in 2015 were attributable to distracted driving.

How convenient, then, that tech companies like Apple and Alphabet, whose products and services helped create smartphone-addled drivers, are also investing in autonomous-vehicle technology. They didn’t create the distracted-driving problem, but they did exacerbate it. Now they can enjoy the rhetorical benefit of helping to solve a problem they also worsened.

* * *

Amazon’s supposed feminist intervention is of this second kind. The company didn’t invent misogyny, but its Echo line of products and services, which make use of the personal assistant Alexa, played to sexist tropes before the company stepped in to help remedy matters.

If you survey the major voice assistants on the market—Alexa, Apple’s Siri, Microsoft’s Cortana, and Google Home’s unnamed character—three of the four have female-sounding names, and all four speak in female-sounding voices by default. Even before the user addresses Alexa, the robot has already established itself as an obedient female presence, eager to carry out tasks and requests on its user’s behalf.

There’s nothing wrong with providing a service, of course, especially when it’s one that people need and one that a service provider conducts effectively. But women are particularly stereotyped into such roles. In Western culture, women’s traditional role was seen as that of caregiver and homemaker. And even when women entered the workplace, they did so in roles that reinforced that stereotype. Service-oriented jobs such as nursing, social work, cashiering, secretarial work, teaching, waiting tables, librarianship, customer service, and housekeeping are disproportionately held by women. Today, manufacturing jobs are on the decline and service jobs are on the rise, but even so, men have been avoiding the new opportunities in the service sector—partly because those jobs are seen as women’s work.

The moment one is tempted to call Echo or Home a “she,” a battle has already been lost. A truly feminist Alexa, one that might decouple service work from passive femininity, wouldn’t have been cast as “Alexa” to start with, but perhaps as a baritone named Alex instead.

What’s worse than a stereotypically subservient female automaton? One that is also a bad service worker. The Star Trek computer (its name is LCARS), the fantasy origin of voice-activated devices, also had the voice of a woman, after all. But it was an eminently competent computer-woman, one who could carry out any request and access any tidbit of information instantly and accurately. LCARS is fictional, of course, so all the computer’s responses are accurate because they are scripted. Even so, the characters’ requests to “Computer” set the bar for competence in a female voice assistant. Perfection is assumed, not mere proficiency.

Alexa (or Siri, or Home, or any of them) works remarkably well given the current state of voice recognition, machine learning, and the other technologies that help it field requests. But it still works pretty badly. In my experience with Echo, Alexa often can’t respond at all—instead, it reports back with an apology like, “I’m sorry, I’m not sure.” When it does understand, sometimes it can’t answer effectively. When one of my kids recently asked Alexa, “How do I make popcorn on the stove?” it responded, “Sorry, no recipes were found for popcorn.” It’s kind of right, but also utterly useless.

When the robot does respond successfully, often its answers feel out of touch, inhuman even. “When did Mozart die?” for example, produces the over-detailed response, “Wolfgang Amadeus Mozart’s date of death is Monday, December 5, 1791.” Probably “in 1791” would have been sufficient; adding “Monday” makes Alexa seem pedantic and sanctimonious—other bad traits that women are sometimes accused of harboring.

Alexa can send messages to other Echo devices, and receive them. It transcribes those messages in the Amazon Alexa smartphone app, which makes it possible to read them as text on the go. That’s convenient in certain circumstances, but like most transcriptions, Alexa’s are sometimes garbled or inscrutable. Transcription is a tough problem, and Amazon isn’t alone in bungling it. But when a message comes out mangled, the character of Alexa takes the blame for the flub.

Given the foibles of voice technology, it was inevitable that people would start abusing Alexa. It’s frustrating when Alexa gets something wrong, because the service has set such a high bar for functionality. People are supposed to be able to query Alexa (or Siri) hands-free, on any subject, in circumstances that might make manipulating a smartphone difficult.

It’s worth comparing the interactions just described with similar ones on other information services that were not cast as women. If you Googled for some popcorn instructions or a Mozart biography, the textual results might also disappoint. But you’d just read over that noise, scanning the page for useful information. You’d assume a certain amount of irrelevant material and adjust accordingly. At no point would you be tempted to call Google Search a “bitch” for failing to serve up exactly the right knowledge at whim. And at no point would it apologize, like Alexa does.

Transcription works likewise. When I speak a text message into my iPhone, the device often transcribes it just as inaccurately as Alexa does. Because Siri is the agent I speak to when I speak to my iPhone, and because I am human and prone to anthropomorphizing, I associate that failure with Siri—even though text transcription is a separate operating-system service. And yet, when the phone transcribes a telephone voice message and displays it—just as inaccurately, mind you—I tend not to connect that failure to Siri, but to the genderless, faceless, infrastructural amalgam of AT&T and Apple. Maybe the best way to represent women as technological apparatuses is to avoid doing so in the first place.

Amazon disagrees. When asked to comment on my assertions that Alexa is actually a setback for feminism, a spokesperson offered this statement:

We’re 100 percent focused on our customers, and feedback from customers is that they love Alexa’s voice. When we developed Alexa’s personality, we wanted her to have a lot of attributes we value at Amazon, like being smart, helpful, and humble, while having some fun, too. These attributes aren’t specific to any one gender, rather traits we value in all people.

* * *

Quartz’s Fessler reports that the “disengage mode” Amazon created to parry sexist or abusive comments was driven by both customer feedback and a deliberate effort to curtail “inappropriate” interactions with Alexa. That’s a positive step. But it sidesteps the accusation that Alexa’s very conception was a setback for feminism in the first place.

Even if Alexa now turns a deaf ear to abuse, and even if it spouts the right aphorisms about equity when asked about the topic directly, those steps can’t make up for the nature of its design: a countertop housemaid who promises to answer all questions and requests while never being given the ability to do so effectively. That’s just a rehash of the basics of women’s subjugation, not a reprieve from it.

It’s a disorienting conclusion, because Alexa certainly appears to be more feminist than it used to be, and because Amazon has parlayed that appearance of improvement into positive press, which readers are likely to take as a sign of the company’s broader commitment to equality writ large. As the writer Chris Dillow recently noted, when wealthy and powerful individuals and corporations send “virtue cues,” as Amazon has done with Alexa, those signals tend to trickle down to the public as representative of real virtue.

But merely to signal virtue, rather than truly to live it, risks decoupling talk from action. When Amazon enjoys effusive praise for a version of feminism that amounts to koans and cold shoulders, it can use that platform to justify ignoring the broader structural sexism of the Echo devices—software, made a woman, made a servant, and doomed to fail.

Ian Bogost is a contributing writer at The Atlantic.