The cool, calm, and all-knowing voices of your Amazon Alexa or your iPhone’s Siri may provide plenty of fodder for jokes about the incoming dystopia, but it turns out your “smart” home and speaker devices may be bringing us closer to a more troubling future than we realize. A recent study by the United Nations Educational, Scientific and Cultural Organization (UNESCO) has concluded that the continued prevalence of female-voiced “smart home” devices reinforces and entrenches negative gender biases.
The report, titled “I’d Blush If I Could” (after Siri’s programmed response to being called a particular misogynist slur), argues that the function of these devices and the presentation of their personalities – programmed and engineered by teams composed mostly of men – promote and reinforce sexist stereotypes of women as submissive and over-accommodating.
UNESCO claims that as the submissively programmed personalities of female-gendered assistants such as Siri or Microsoft’s Cortana pop up in homes across the globe, they increasingly normalize a specific stereotype of female identity, having been designed to be easily “available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK.’”
By presenting these gendered forms of artificial intelligence as ones that hold “no power of agency beyond what the commander asks of it,” but rather “honours commands and responds to queries regardless of their tone or hostility” with “deflecting, lacklustre or apologetic responses,” UNESCO claims that this technology comes to “reinforce commonly held gender biases that women are subservient and tolerant of poor treatment.”
The conclusions reached by the report are troubling, especially considering that smart home technology doesn’t look like it will be going anywhere anytime soon (the report claims that roughly a billion tasks per month are currently carried out via this technology). UNESCO does provide a solution, though, and a refreshingly simple one at that: diversify.
Pointing out that women make up only 12% of A.I. researchers, the report goes on to highlight that “technology reflects the values of its developers and that of the information they draw from.” Thus, when it comes to stopping the sexism problem in the smart home industry, it concludes “that having more diverse teams working in the development of such technologies might help in identifying biases and prevent them.”