Facebook has axed the editorial team responsible for writing the story descriptions that show up in the Trending Topics section, according to sources who spoke with Quartz.
The team of 15 to 18 contractors, hired through a third party, was notified of the termination at 4 p.m. and asked to leave the premises by 5 p.m. the same day.
Some members of the editorial team were said to have been working on Trending Topics for a year and a half.
Trending Topics: Human vs Machine Intelligence
With the editorial team fired, Facebook will now rely primarily on computer algorithms to surface the topics that are generating buzz on the social network. This is only one of the many ways Facebook is deploying machine intelligence to understand how people are using the platform.
In place of the team of writers, engineers will oversee the results the algorithm yields and determine whether they are newsworthy.
Facebook has long insisted its Trending Topics section is automated.
"Topics that are eligible to appear in the product are surfaced by our algorithms, not people," Justin Osofsky, Facebook's VP for Global Operations, explained in May, while recognizing that a team of human editors "play an important role" in ensuring the results are relevant and of high quality.
Former Trending Topics staff, however, alleged at the time that human editors and news curators were asked to select topics manually and that some discarded stories they deemed too conservative or at odds with their personal beliefs.
'Facebook Is A Platform For All Ideas'
In response to the allegations of political bias that erupted in May, Facebook vowed to limit its algorithm's dependence on some 1,000 news outlets to vet stories for the Trending Topics section.
Now, as the social media company continues to update the service, the site will remove the short description that previews each topic and show only the number of people talking about it.
As before, Trending Topics will determine which stories to feature based on the total number of mentions and on sharp spikes in mentions over a short period.
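Facebook has not published its formula, but a minimal sketch of that kind of spike detection, with the window sizes and example numbers chosen purely for illustration, might look like this:

```python
def trending_score(hourly_mentions, baseline_hours=24, recent_hours=1):
    """Toy spike detector: compares the most recent hour(s) of mentions
    to the average hourly rate over a longer baseline window.

    hourly_mentions: list of hourly mention totals, oldest first.
    Returns a ratio; values well above 1.0 suggest a sudden surge.
    """
    if len(hourly_mentions) < baseline_hours + recent_hours:
        return 0.0  # not enough history to judge a spike
    baseline = hourly_mentions[-(baseline_hours + recent_hours):-recent_hours]
    recent = hourly_mentions[-recent_hours:]
    baseline_rate = max(sum(baseline) / len(baseline), 1e-9)  # avoid divide-by-zero
    recent_rate = sum(recent) / len(recent)
    return recent_rate / baseline_rate


# Example: a topic averaging 200 mentions per hour suddenly jumps to 1,800.
history = [200] * 24 + [1800]
print(round(trending_score(history), 1))  # 9.0
```

Whatever signals Facebook's production system actually weighs, the core idea of comparing recent activity to a longer baseline is what lets a sudden surge stand out from ordinary chatter.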
By terminating the full team of human editors and news curators and introducing these changes, Facebook believes it will need to make "fewer individual decisions about topics."
"Facebook is a platform for all ideas," the company claims, "and we're committed to maintaining Trending as a way for people to access a breadth of ideas and commentary about a variety of topics."
Can Facebook Really Rid Its System Of Human Bias?
The question is: how effectively can Facebook strip its system of human bias?
From the News Feed to targeted ads to Trending Topics, Facebook is increasingly developing its machine intelligence systems to gain better insights into its user base and audience, without relying too much on human intervention.
Often, however, machine learning still depends on cues from humans: signals gleaned from patterns of human behavior.
The former editorial team, by examining stories and applying its own judgment, may have been crucial to Facebook's effort to develop an algorithm with a nose for news.
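Facebook has not said whether the curators' choices ever fed back into its models, but one illustrative way human judgment becomes a machine signal is to treat past editorial accept-or-reject decisions as training labels. The headlines, labels and model choice below are assumptions made for the sketch, not a description of Facebook's system:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: headlines human editors approved (1)
# or rejected (0) for a trending module. Real data would be far larger.
headlines = [
    "Earthquake strikes off the coast of Chile",
    "Senate passes long-debated budget bill",
    "You won't believe this one weird trick",
    "Celebrity spotted eating lunch",
]
editor_approved = [1, 1, 0, 0]

# The editors' judgments become the labels the model learns to imitate,
# which is also how their biases could be absorbed along with their news sense.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(headlines, editor_approved)

print(model.predict(["Severe flooding hits the Gulf Coast"]))  # 0 or 1, learned from the editors
```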
One study [pdf] shows how biases embedded in human language can be imprinted on machine intelligence.
"Human-like semantic biases result from the application of standard machine learning to ordinary language — the same sort of language humans are exposed to every day," researchers from Princeton University and the University of Bath report.
Given these biases, even smart technologies can be used to "perpetuate" prejudice, in much the same way that traditional news organizations carry biases of their own.
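The researchers measured such effects with association tests over word embeddings. The tiny vectors below are invented solely to show the mechanics of that kind of test and are not the paper's data:

```python
import numpy as np

def cosine(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def association(target_vec, pleasant, unpleasant):
    """How much more a target word associates with 'pleasant' attribute
    words than with 'unpleasant' ones, measured by cosine similarity."""
    return (np.mean([cosine(target_vec, p) for p in pleasant])
            - np.mean([cosine(target_vec, u) for u in unpleasant]))

# Invented 3-d "embeddings" standing in for vectors trained on real text.
vectors = {
    "flower":   np.array([0.9, 0.1, 0.0]),
    "insect":   np.array([0.1, 0.9, 0.0]),
    "pleasant": np.array([0.8, 0.2, 0.1]),
    "awful":    np.array([0.2, 0.8, 0.1]),
}

print(association(vectors["flower"], [vectors["pleasant"]], [vectors["awful"]]))  # positive
print(association(vectors["insect"], [vectors["pleasant"]], [vectors["awful"]]))  # negative
# A consistent gap between such scores is the kind of "human-like bias"
# the researchers found baked into embeddings trained on ordinary language.
```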
With more people turning to social media as a source of news, it is more important than ever for Facebook to get its formula right.