Sony Research Finds AI To Be More Prone to Inaccuracies When Used on Dark-Skinned People

Existing skin tone scales might be replaced to eliminate AI biases.


AI algorithms tend to exhibit biases related to skin tone, new research from Sony claims. Beyond the traditional light-to-dark spectrum, the researchers want red and yellow skin hues included in measurement scales to eliminate potential inaccuracies in future studies.

Skin Color Biases in AI Systems

Years on, AI systems still raise concerns for researchers, as biases around skin color persist. In a 2018 study, Joy Buolamwini and Timnit Gebru found that AI systems produced markedly less accurate results for darker-skinned women. The finding pushed companies to improve their algorithms across a range of skin tones.

On Sept. 10, Sony AI researchers William Thong and Alice Xiang published a study on the "multidimensional" measurement of skin color, in collaboration with Przemyslaw Joniak of the University of Tokyo.

Google's 10-point scale, popularly known as the Monk Skin Tone Scale, evaluates an array of skin tones from dark to light. Another common measure is the six-category Fitzpatrick scale, which Meta has used in previous research.

However, Sony's research reveals a fundamental issue with existing scales: both the Monk Skin Tone Scale and the Fitzpatrick scale focus primarily on how light or dark skin is. According to Alice Xiang, Sony's global head of AI ethics, this narrow focus allows many biases to go undetected and unaddressed.

As Sony notes in its blog post, these scales fail to capture biases against groups such as East Asians, South Asians, Hispanics, and Middle Eastern individuals, whose skin tones do not fit neatly within the traditional light-to-dark spectrum.

For instance, Sony's research uncovered that common image datasets disproportionately feature individuals with lighter, redder skin tones, while underrepresenting those with darker, yellower skin tones. This skew can lead to reduced accuracy in AI systems.

"Our hope is that the work that we're doing here can help replace some of the existing skin tone scales that really just focus on light versus dark," Xiang says in an interview with Wired.

The company found that Twitter's image-cropper and two other image-generating algorithms exhibited a preference for redder skin tones. Additionally, some AI systems mistakenly classified individuals with redder skin hues as "more smiley."

How Sony Will Address Skin Tone Biases in AI

Sony proposes an automated approach built on the existing CIELAB color standard, per The Verge. Because CIELAB separates perceptual lightness from hue, this approach would eliminate the need for manual categorization and address the limitations of scales such as the Monk Skin Tone Scale.
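To make the idea concrete, here is a minimal sketch of how a skin pixel could be described along two axes in CIELAB: perceptual lightness (L*) and the hue angle between the red (a*) and yellow (b*) components. The sRGB-to-CIELAB conversion below uses the standard D65 formulas; the function names and the two-axis summary are illustrative assumptions, not Sony's actual implementation.

```python
import math

def srgb_to_linear(c):
    """Invert the sRGB gamma curve (c in 0..1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def rgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIELAB (D65 white point)."""
    rl, gl, bl = (srgb_to_linear(v / 255.0) for v in (r, g, b))
    # Linear RGB -> XYZ (standard sRGB matrix, D65)
    x = 0.4124564 * rl + 0.3575761 * gl + 0.1804375 * bl
    y = 0.2126729 * rl + 0.7151522 * gl + 0.0721750 * bl
    z = 0.0193339 * rl + 0.1191920 * gl + 0.9503041 * bl
    # Normalize by the D65 reference white
    xn, yn, zn = x / 0.95047, y / 1.0, z / 1.08883
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16.0 / 116.0
    fx, fy, fz = f(xn), f(yn), f(zn)
    L = 116 * fy - 16          # perceptual lightness, 0..100
    a = 500 * (fx - fy)        # green (-) to red (+)
    b_lab = 200 * (fy - fz)    # blue (-) to yellow (+)
    return L, a, b_lab

def skin_descriptors(r, g, b):
    """Summarize a skin pixel as (lightness L*, hue angle in degrees).
    Larger hue angles lean yellower; smaller angles lean redder."""
    L, a, b_lab = rgb_to_lab(r, g, b)
    hue = math.degrees(math.atan2(b_lab, a))
    return L, hue
```

Because both axes are computed directly from pixel values, a dataset audit could report not only how light or dark its faces are but also how red or yellow, which is precisely the dimension the study says existing scales miss.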

The Monk Skin Tone Scale, created by Ellis Monk, is intentionally streamlined with 10 skin tones to ensure diversity without introducing inconsistencies associated with more categories. It has faced some criticism regarding the consideration of undertones and hues.

Monk counters that research was devoted to deciding which undertones to prioritize along the scale and at which points, and he believes the scale adequately accounts for these factors.

While Sony's approach provides a more multifaceted perspective on skin tones, it is crucial to ensure that any new measures maintain simplicity to be practically effective. Monk himself emphasized the cognitive challenges associated with managing an excessively detailed scale.

In other news, separate research from the University of Melbourne found that AI showed bias toward parents in the workforce.

Joseph Henry
Tech Times
ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.