Facebook has reported BBC to the police after the news outlet reported that some Facebook users exchange images of child abuse through online groups.
BBC conducted an investigation, and upon finding sexualized images of children on the social network, it reported them using Facebook's own moderation system. However, BBC says that Facebook failed to remove dozens of such images.
According to BBC, Facebook removed only 18 of the 100 photos reported, saying that the rest did not violate its community standards. Seeing that 82 sexualized images of children remained in place even after being reported, BBC says it contacted the social network and set up an interview for the following week.
Facebook Reports BBC To Police
In an unexpected twist, Facebook struck back and did some reporting of its own.
"When provided with examples of the images, Facebook reported the BBC journalists involved to the police and cancelled plans for an interview," says BBC.
"It subsequently issued a statement: 'It is against the law for anyone to distribute images of child exploitation.'"
Damian Collins, chairman of the Commons media committee, says he has "grave doubts" about the effectiveness of Facebook's content moderation systems, and finds it extraordinary that Facebook reported BBC to the police when the media outlet was simply trying to help keep the network clean.
Convicted Pedophiles On Facebook
Moreover, BBC points out that while Facebook does not allow convicted sex offenders to have an account on the social network, it found five convicted pedophiles with profiles. BBC reported those profiles as well, also via Facebook's own system, but the social network did not remove any of them.
As Collins further highlights, the episode raises the question of how one can effectively report disturbing content to Facebook and actually see it taken down.
That's not to say that every reported post should be removed, as in many cases the content does not actually offend or violate community standards in any way. In cases involving child pornography and convicted pedophiles, however, Facebook should take action and clean up the place.
BBC Investigation Into Facebook Moderation Practices
For some background to this whole mess, here's the deal: BBC first asked Facebook for an interview regarding its content moderation practices back in 2015. BBC kept investigating, and in February 2016, it revealed how pedophiles use Facebook groups to exchange pornographic images of children. At the time, Facebook pledged to improve its moderation system, but BBC continued to investigate to see if the promise materialized.
Last week, Facebook's director of policy, Simon Milner, finally agreed to an interview, provided that BBC supplied examples of the material it had reported and moderators had failed to remove.
That brings things to this point — BBC provided those examples, and Facebook reported it to the UK's National Crime Agency for sending sexualized images of children.
Facebook subsequently offered a statement to BBC, informing the media outlet that it had since removed all files that were illegal or violated its standards. At the same time, the social network said it is working to further improve its reporting and takedown processes.
"When the BBC sent us such images we followed our industry's standard practice and reported them to CEOP," added Facebook, with CEOP standing for the Child Exploitation and Online Protection Centre.