Siri, Stereotypes, and the Mechanics of Sexism

Research output: Contribution to journal › Article › peer-review


Feminized AIs designed for in-home verbal assistance are often subjected to gendered verbal abuse by their users. I survey a variety of features contributing to this phenomenon—from financial incentives for businesses to build products likely to provoke gendered abuse, to the impact of such behavior on household members—and identify a potential worry for attempts to criticize the phenomenon: while critics may be tempted to argue that engaging in gendered abuse of AI increases the chances that one will direct this abuse toward human beings, the recent history of attempts to connect video game violence to real-world aggression suggests that things may not be so simple. I turn to Confucian discussions of the role of ritualized social interactions both to better understand the roots of the problem and to investigate potential strategies for improvement, given a complex interplay between designers and device users. I argue that designers must grapple with the entrenched sexism in our society, at the expense of "smooth" and "seamless" user interfaces, in order to intentionally disrupt entrenched but harmful patterns of interaction, but that doing so is both consistent with and recommended by Confucian accounts of social rituals.
Original language: English (US)
Journal: Feminist Philosophy Quarterly
Issue number: 3
State: Published - Dec 21 2022


  • etiquette
  • verbal abuse
  • feminine AI
  • Alexa
  • Siri
  • artificial intelligence
  • virtual assistants
  • Confucian ethics
