Mark Zuckerberg and Facebook Bosses Ignored Research Findings on Instagram's Racist Algorithm

Facebook researchers claim that Mark Zuckerberg and other Facebook executives ignored the rules they proposed to address alleged bias in Instagram's automated account removal system, which appeared to disproportionately target Black users.

According to an NBC News article published Thursday, July 23, and summarized by Gizmodo, higher-ups even told the researchers, who requested anonymity, to stop any further research into racial bias in Facebook's moderation tools.

The issue is rooted in Facebook's attempt to make its automated moderation systems race-neutral, building an algorithm whose approach amounts to "You know, I don't really see color."

Using those same tools, Facebook proactively removed fewer hate-speech posts targeting marginalized groups, including Black, transgender, and Jewish users, than users themselves reported. In other words, posts that users deemed offensive were slipping past Facebook's automated detection.

The proposed rules were eventually scrapped, and Instagram instead rolled out an updated version of the moderation tool. Employees were even barred from testing the revised tool for racial bias.

Facebook's response to the report

Facebook claimed that the researchers used a flawed methodology. However, it did not deny issuing a moratorium on probing racial bias in its moderation tools. In an interview with NBC, Facebook's VP of Growth and Analytics, Alex Schultz, said the decisions were made based on concerns about ethics and methodology.

Facebook said that it is currently looking for better methods of testing its products for racial bias. In his interview with NBC, Schultz said that racial bias on Facebook's platforms is a "very charged topic," but that the company has significantly increased its investment in investigating algorithmic bias and its effects on moderating hate speech.

The company announced earlier this week that it had created new teams to investigate the racial impacts of its programs. These teams will compare how Black and other minority users are affected by Facebook's and Instagram's algorithms relative to white users.

Facebook spokeswoman Carolyn Glanville said in a statement that the company is "actively investigating how to measure and analyze internet products."

Glanville said that leaders sought a "standard and consistent approach" to prevent biased and negligent work, and set up a project to do that.

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.