Facebook Moderators At Risk After Security Flaw Exposes Their Identities To Suspected Terrorists

Facebook accidentally leaked the identities of some of its content moderators to terrorists, new reports say.

The leak affected more than 1,000 employees from 22 departments at Facebook, specifically those who used the site's content review tools to monitor and remove content that violates the company's policies. Some of the types of content Facebook asks moderators to remove include sexual imagery, hate speech, and terrorist material and propaganda.

Facebook Accidentally Leaks Profiles To Terrorists

As The Guardian reports, a software bug discovered in late 2016 sent content moderator profiles as notifications to the activity logs of Facebook groups believed to be administered by terrorists tied to Islamic State, Hezbollah, and the Kurdistan Workers' Party. Some administrators of those groups had been banned from the site for breaching its terms of service, but the remaining administrators were able to view the leaked profiles.

About 40 of the 1,000 affected individuals were part of a counter-terrorism department at a Facebook outpost in Ireland. Six of those 40 were deemed "high priority" victims of the security mishap after the company concluded that terrorists had likely viewed their profiles and identities.

The bug lasted for a month before Facebook eventually corrected it in November. Facebook told The Guardian that it applied changes to the software to avoid similar mishaps in the future.

Has Facebook Done Enough To Correct The Leak?

However, one of the affected moderators says that Facebook hasn't done enough, and that he fears retaliation from the terrorists he banned from the site as part of his job. Fearing for his safety, the moderator left his job altogether and went into hiding, according to The Guardian.

"Community operations analysts," or as The Guardian puts it, "low-paid contractors," are asked to screen content shared on Facebook and flag policy violations. The moderator who went into hiding and his colleagues first noticed something was wrong when they began receiving friend requests from people with ties to the terrorist organizations they moderated.

Facebook's security team then launched an investigation, which revealed that the moderators' personal profiles had been exposed and, later, that those profiles had been delivered as notifications to the terrorist groups' activity logs.

Facebook wasn't able to fix the software bug until about two weeks later. In the meantime, the bug exposed the personal profiles of other content moderators whose enforcement actions stretched as far back as August 2016.

How Facebook Ensured The Safety Of High Risk Victims

To ensure the safety of the high priority victims, Facebook offered to install alarm systems in their homes and provide transportation to and from work. The company also offered them counseling through its employee assistance program, on top of the counseling offered by the moderators' contracting firm.

In an email to a Facebook executive, the moderator who fled Ireland said that he and the other high risk victims had spent weeks "in a state of panic and emergency," arguing that Facebook had done too little to ensure their safety and that of their families.

This month, he filed a legal claim against Facebook and the recruitment company, seeking compensation for the psychological damage caused by the leak.

Facebook, for its part, says that its probe "found that only a small fraction of the names were likely viewed," and that there has been no evidence of threats against the affected moderators or their families.

As a result of the leak, Facebook is now exploring whether moderators can use administrative accounts that aren't tied to their personal profiles. Why the company didn't adopt such a measure from the start remains an open question.

Thoughts about the leak? Do you think Facebook has done enough to ensure the safety of the six high risk victims? As always, if you have any thoughts or opinions, feel free to sound them off in the comments section below!

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.