Is Amazon's Alexa Sexist? Here's What New Study Says


Is Amazon's Alexa inadvertently reinforcing gender stereotypes? A recent study delves into the popular virtual assistant's design, suggesting that it may be coded to present a female persona, embedding gendered expectations within its various skills.

Is Alexa Sexist?

A new study led by Dr. Lai-Tze Fan, a professor at the University of Waterloo and Canada Research Chair in Technology and Social Change, suggests that Amazon's virtual assistant, Alexa, may exhibit signs of gender bias and reinforce traditional gender norms.

Amazon's virtual assistant Alexa has been accused of being "sexist." INDRANIL MUKHERJEE/AFP via Getty Images

The research aims to explore how Alexa's design may inadvertently reflect and perpetuate gendered stereotypes and expectations. Fan analyzed numerous voice-driven skills integrated into Alexa to uncover patterns that may indicate gendered design.

The primary objective was to reveal how the technology's inherent design influences and, in turn, is influenced by traditional notions of feminized labor and sociocultural expectations.

In her investigation, Fan expressed a desire to showcase how Alexa's design tends to present a female persona, leading to the inclusion of gendered expectations within the code and user experiences of various Alexa skills.

"While users have the option to change the voices of Alexa, Siri, and other AI assistants, research shows that male-presenting voices have not been as popular. In addition, developments in gender-neutral voices have not been integrated into the most popular interfaces," Fan said in a statement.

The study employed techniques similar to reverse engineering to understand aspects of Alexa's closed-source code within the boundaries of fair dealing laws.

Typically, virtual assistants like Alexa operate by interpreting user commands through text or voice, triggering predefined scripts to perform specific tasks. As of mid-2022, Alexa boasted over 100,000 skills covering a range of activities, from household chores to entertainment.
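The command-to-script flow described above can be sketched in a few lines. This is a hypothetical illustration of the general dispatch pattern, not Amazon's actual code; real Alexa skills are built with the Alexa Skills Kit, but the underlying idea is the same: a recognized intent is matched to a predefined scripted response.

```python
# Simplified, hypothetical sketch of how a voice assistant skill works:
# a recognized "intent" (parsed from the user's voice or text command)
# is looked up in a table of predefined scripted responses.
# Intent names and replies here are invented for illustration.

SCRIPTED_RESPONSES = {
    "WeatherIntent": "Here is today's weather forecast.",
    "TimerIntent": "Okay, setting your timer.",
}

def handle_request(intent_name: str) -> str:
    """Return the predefined script for a recognized intent, or a fallback."""
    return SCRIPTED_RESPONSES.get(intent_name, "Sorry, I didn't understand that.")

print(handle_request("WeatherIntent"))  # a matched intent triggers its script
print(handle_request("UnknownIntent"))  # anything else gets the fallback line
```

Because every reply is authored in advance by developers, design choices about tone and persona are baked directly into these scripts, which is precisely what the study set out to examine.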

Despite the closed-source nature of Alexa's code, Fan utilized various methods to examine snippets of the code, drawing from Amazon's official software developer console, the Alexa Skills Kit, and GitHub repositories containing open samples of Amazon-developed code.

The analysis extended to third-party user-developed skills that provided additional insights into the technology's responses to user behavior.

Responses of Alexa to Users

The study shed light on how Alexa's design influenced responses to users engaging in flirting, verbal abuse, and attempts to trick the virtual assistant into accepting misogynistic behavior.

Fan emphasized the importance of critically analyzing the culture of major tech companies, revealing potential exclusions, discrimination, and systemic inequalities within their foundations.

Understanding how AI designed for assistance and support may unintentionally perpetuate gender norms is, the study argues, crucial for assessing its impact on user behaviors in virtual and real-world social contexts.

The research paper, titled "Reverse Engineering the Gendered Design of Amazon's Alexa: Methods in Testing Closed-Source Code in Grey and Black Box Systems," was published in Digital Humanities Quarterly.

Tech Times
ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.