Facebook is taking steps to prevent child sexual abuse material from spreading on its site and to combat users who share such content. The update is meant to curb the sharing of child exploitation content after the social network found 13 million harmful videos and images between June and September 2020 alone.
Facebook introduces child sexual abuse prevention update
According to an MUO report, the update comes as the social network faces pressure to combat such behavior amid its plans to enable default encryption for messages on Facebook Messenger and the Facebook-owned photo service Instagram.
In a post on the Facebook Newsroom, the company states plainly that using its apps to harm children is "abhorrent and unacceptable." The new feature will show a warning message to any user who searches for terms related to child exploitation.
Not only will a pop-up notification appear, but Facebook will also include a link to an offender diversion program. A secondary notice states that child sexual abuse is illegal and that viewing these images can lead to dire consequences, including imprisonment.
In addition, Facebook will remove the exploitative content, store it, and report it to the National Center for Missing and Exploited Children (NCMEC). Figures from the NCMEC show a 31 percent increase in reports of child sexual abuse images to the non-profit organization in 2020.
Facebook says it is using insights from this safety alert to identify behavioral signals of users at risk of sharing this material, which will help the platform discourage the behavior in the future.
Safety alerts will notify users of exploitative content
The second feature Facebook is testing is an alert that will appear when users try to share harmful images of child abuse. The safety alert warns users not to share this type of content again, or they risk being banned from the social media site.
The company states that it is using this tool to educate users on why the content is harmful and to discourage them from sharing it publicly or privately, as reported by CNET.
Moreover, Facebook has updated its child safety policy and reporting tools, stating that it will take down Facebook profiles, pages, and groups, as well as Instagram accounts, that share otherwise innocent images of children with captions, hashtags, or comments containing inappropriate signs of affection or commentary about the children depicted.
Facebook users who report content will also see an option to let the social network know that the photo or video involves an innocent child, allowing the company to prioritize it for review.
The Daily Mail reports that Antigone Davis, Facebook's Global Head of Safety, said the company's industry-leading efforts to combat child exploitation focus on preventing abuse, detecting and reporting content that violates its policies, and working with experts and authorities to keep children safe.
Davis added that Facebook is testing new tools to keep people from sharing content that victimizes children and has made improvements to the site so that its detection and reporting tools are more user-friendly.
Written by: Luis Smith