Facebook Wants To Save You From Violent Posts, Hires 3,000 More People To Fight Them

Facebook has just announced that it wants to improve its video moderation efforts. The move comes in response to several reports that brought to light how disturbing content, such as suicide videos, can remain undetected on the site for hours.

An Additional 3,000 People Will Monitor Facebook Reports

Facebook has now confirmed that it's going to add 3,000 more people to its operations team to screen videos published on the site, with the goal of responding quickly when it finds harmful content.

These hires will add to the 4,500 people already working in such a capacity at Facebook. On the numbers alone, the expansion is impressive, because it could finally allow adequate supervision and monitoring of inappropriate content shared on the social network. But a few crucial things remain unclear: are these new people full-time employees or contractors? And how exactly will they screen all the videos published on the site?

Facebook is nearing 2 billion users and receives millions of reports every week. Screening all of those is a tall order, but CEO Mark Zuckerberg is hopeful, writing in a Facebook post:

"If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner — whether that's responding quickly when someone needs help or taking a post down."

Working With Community Groups And Law Enforcement

Zuckerberg says these reviewers will help the site remove content that violates its policies. In addition, Facebook will keep working with local community groups and law enforcement when someone needs help, particularly when a person is about to harm themselves or is in danger from someone else.

Facebook is taking wise steps. Adding more workers tasked with monitoring makes the company better informed about potentially harmful content, something it wants to get rid of on the site. It has also been steadily improving its algorithms and, in turn, making it easier for people to submit reports should concerns arise. In March, the company rolled out suicide prevention tools. Shortly after, it took steps to combat revenge porn.

Facebook's Role Is Becoming Harder To Define

Facebook is the world's biggest social network, a platform where people around the world can converse, engage, and interact with one another instantaneously. But that accessibility comes at a price: not everyone on the site has benign intentions. Some want to cause harm, and they want everybody to see it. Because of this, Facebook finds itself in the crucial and contested position of having to define its role in definite terms.

Is it just a social network, or is it also a news site? If you see a news item on Facebook, does that mean it should always be taken as accurate? What roles does the platform play in human communication, information dissemination, and media consumption? And how can we measure, evaluate, and ultimately work to refine those experiences?

These are difficult questions, and even Zuckerberg himself might not be able to answer half of them. But it's reassuring to know that Facebook doesn't want what the majority of its users don't want: fake news, inappropriate content, revenge porn, recorded suicide attempts, and anything else that makes the service more noxious.
