Facebook is working to fight toxic content in groups, some of which have become hotbeds of hate and disinformation.
Tools To Curb Toxic Content On Facebook Groups
In a blog post published on Aug. 14, Facebook's head of groups Tom Alison introduced new moderation measures for groups.
These include giving group admins an improved overview of posts that Facebook has flagged or removed for violating its Community Standards.
To curb misleading and fake content, Alison said that Facebook has added a section for false news found in the group.
Alison explained that this will give admins more clarity about how and when Facebook enforces policies in their group, and make it easier for them to see what is happening in their community.
Facebook also added a section for rules that will make it clear to members what is and is not allowed in the group. Admins and moderators will likewise have the option to share which particular rule a member broke when they decline a pending post, remove a comment, or mute a member.
Members of the group will also have more transparency and control. People who join a group will get a sense of the community they will be a part of through visible information about the group, including who the admins and moderators are and whether the group has used other names in the past.
Facebook users can also preview a group they have been invited to join and then accept or decline the invitation.
Hotbed Of Misinformation And Inappropriate Content
Facebook introduced the new tools amid concerns that some hidden Facebook groups have become breeding grounds for misinformation and inappropriate content, including fake news and dangerous medical hoaxes.
"Through the Safe Communities Initiative, we'll continue to ensure that Facebook Groups can be places of support and connection, not hate or harm," Alison said. "There's always more to do, and we'll keep improving our technology, tools and policies to help keep people safe."