Facebook now lets members of its groups be designated as "experts," a title that substantially amplifies their posts.
The idea is to elevate "knowledgeable experts" who can serve as authoritative figures within their communities. The move is also seen as an attempt by Facebook to shift possible blame, and the work of moderation, onto users instead of the company itself.
Facebook Gives More Power To Its Users
The people who run communities on Facebook now have the authority to promote individuals within their groups to the title of "expert."
The individuals dubbed experts can then serve as the voices to whom members direct their questions and concerns. The feature is meant to counter the misinformation that has plagued online communities for a while now.
The move is eerily similar to Facebook's creation of the oversight board, known to most users as the company's "supreme court." Facebook has faced massive backlash and pressure over how strictly its moderators handled potentially harmful content, and over the company's failure to take complete responsibility.
Similar Methods Introduced Before
With the introduction of the oversight board, Facebook spent an estimated $130 million on a body charged with ruling on potentially harmful content. The board was an outsourced group that reviewed Facebook's moderation decisions, and the arrangement did not end well.
In May, the board pushed a case back to Facebook, telling the company to do its own work, stop being "lazy," and make its own rules on the matter, based on a report by Business Insider.
Now, Facebook's latest attempt to shift responsibility away from itself is to designate "experts" within groups who can be held accountable for what they say and what it may incite. The oversight board exposed the flaw in this approach, and the "experts" feature may suffer the same one.
Potentially Negative Consequences
The social platform now empowers specific individuals inside groups, including groups devoted solely to spreading misinformation.
The "Stop the Steal" group, for example, was created in November 2020 and grew to over 365,000 members who were convinced that the presidential election was fraudulent.
Had Facebook not removed the group two days later, its harmful effects would only have continued.
Facebook explained that the group discussed "the delegitimization of the election process" and called for violence, as reported by the BBC.
Even before that, other groups on Facebook promoted violence and calls to action that threatened civil governance.
Facebook Faces Worrying Concerns
Facebook said in its announcement that over 70 million admins and moderators already operate its groups around the world.
As moderating is not an easy job, Facebook's expert tool might ease some of the burden of fighting misinformation. However, there is still no foolproof way to finally curb misinformation on online platforms.
The question raised is how Facebook will be held accountable for giving users the power to elect their own experts, and how the company will respond if those experts are found guilty of spreading misinformation themselves.
This article is owned by Tech Times
Written by Alec G.