Racially Biased AI Poses Significant Risks, Particularly in Facial Recognition Tech

Dr. Christian challenges the common belief that technology is inherently unbiased.

University of Alberta Faculty of Law Assistant Professor Dr. Gideon Christian has issued a warning about the harmful consequences of racially biased artificial intelligence (AI) in a press release from the institution.


Risks in AI Facial Recognition Technology

Dr. Christian has been awarded a $50,000 grant from the Office of the Privacy Commissioner Contributions Program for his research project titled "Mitigating Race, Gender, and Privacy Impacts of AI Facial Recognition Technology."

Christian, recognized as an expert in the AI industry and its legal implications, emphasizes in this study that biased AI not only misleads but can also have severe detrimental effects on people's lives.

Dr. Christian rejects the misconception that technology is inherently unbiased. He specifically highlighted the concerning impact of facial recognition technology on people of color.

He explained that the technology has a track record of mimicking human biases. Some facial recognition systems achieve remarkable accuracy, exceeding 99 percent, when identifying white males.

The picture changes drastically, however, when the same systems identify people with darker skin tones. For Black women in particular, the technology displays its most alarming error rate, climbing to approximately 35 percent.

Extending Beyond the US

Christian's warning extends beyond the borders of the United States: Canada may also grapple with similar incidents of misidentification. Because the technology is often deployed covertly, however, there are few records and little public awareness of how widespread its use is within the country.

Within Canada, specific cases have emerged, notably involving Black women and immigrants who have secured refugee status.

In some of these cases, refugee status was revoked after facial recognition technology matched the individuals to other people. The government argued that these individuals had initially made their claims under false identities.

Importantly, these incidents disproportionately affected Black women, a demographic group most adversely affected by the technology's highest error rates. Christian clarified that the issue does not lie within the technology itself, which isn't inherently biased.

As per the official press release, the outcomes produced by the technology are a direct reflection of the data it has been exposed to during its training process.
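The mechanism behind this claim, a system calibrated on one demographic group performing worse on another, can be sketched with a toy simulation. The score distributions below are invented for illustration only; they are not drawn from any real facial recognition system:

```python
import random

random.seed(0)

# Toy similarity scores for NON-matching face pairs. Assumption: scores
# for the under-represented group sit higher because the model's features
# were tuned on majority-group training data.
majority_scores = [random.gauss(0.30, 0.10) for _ in range(10_000)]
minority_scores = [random.gauss(0.55, 0.10) for _ in range(10_000)]

# Pick a decision threshold that yields ~1% false matches on the
# majority group, mimicking a system validated on majority-group data.
threshold = sorted(majority_scores)[int(0.99 * len(majority_scores))]

def false_match_rate(scores, t):
    """Fraction of non-matching pairs wrongly declared a match."""
    return sum(s > t for s in scores) / len(scores)

fmr_majority = false_match_rate(majority_scores, threshold)
fmr_minority = false_match_rate(minority_scores, threshold)
print(f"majority false-match rate: {fmr_majority:.1%}")
print(f"minority false-match rate: {fmr_minority:.1%}")
```

A threshold that looks safe on the data the system was trained and tested on produces a dramatically higher false-match rate on the group the training data under-represents, which is the pattern the refugee cases above illustrate.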

Addressing These Issues

Christian expressed the desire for a society free of racial bias and discrimination, emphasizing that this principle guides his research, particularly in the realm of AI.

Interesting Engineering reported that he voiced concerns about the possibility of artificial intelligence technology inadvertently perpetuating racial biases that society has worked tirelessly to combat. He issued a warning that failing to address these biases could potentially erode the progress made over the years.

Christian highlighted that racial bias is not a novel challenge, but what sets it apart is how these biases manifest in artificial intelligence technology.

He cautioned that, if left unchecked, this technology and its associated problems could undo the advancements achieved through the civil rights movement.

Related Article: China Proposes Measures for Facial Recognition Use, Demands 'Individual Consent'

Written by Inno Flores
Tech Times
ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.