Two things are certain when it comes to Facebook: first, the social network collects enormous amounts of user data every day; second, whenever it decides to publish a study, that study is bound to draw a great deal of scrutiny.
Academics are quickly lining up to comment on, criticize and question Facebook's new study, as well as the company's guidelines surrounding the collection, parsing and publishing of data. Some are even calling on CEO Mark Zuckerberg and company to be far more transparent about what Facebook is studying and why.
The study, "Exposure to ideologically diverse news and opinion on Facebook," which was published in the journal Science, had not even been live for a day when it began to catch flak.
In the abstract, the Facebook data scientists stated that the focus was determining how online networks influence users' exposure to perspectives that "cut across ideological lines," based on how 10.1 million Facebook users interact with socially shared news.
The outcome: individuals' own choices have more influence over what they see than Facebook's algorithmic ranking of News Feed content. Not everyone agrees with that conclusion, however, particularly since it happens to work in Facebook's favor.
Some view Facebook's News Feed as a "filter bubble" that allows or prevents content from reaching users based on their sharing and clicking behavior.
"People who self-identify their politics are almost certainly going to behave quite differently, on average, than people who do not," wrote Zeynep Tufekci, professor at the University of North Carolina, Chapel Hill, in her critique.
Last year, a Facebook study on how positive and negative content affects users' moods drew loud criticism for how it was conducted and for the company's lack of transparency with the users it studied.
This most recent study, critics claim, is too limited: it focuses on users who publicly reveal their political affiliation on Facebook, and that group is not representative of the Facebook population as a whole. And because the study was done by Facebook's own data scientists, it needs to be taken with a grain of salt, as the saying goes.
"The study is interesting; I'm thrilled they're publishing this stuff," said Tufekci. "But who knows what else they found?"
Facebook must be much more transparent about who is conducting its research, who crafted the research reports and how they were independently reviewed, believes social scientist Christian Sandvig.
Sandvig noted that the study revealed how Facebook's News Feed algorithms do constrict the diversity of articles and content, but that the researchers do not take the necessary next step of indicating whether that is good or bad for the user.
"So the authors present reduced exposure to diverse news as a 'could be good, could be bad' but that's just not fair. It's just 'bad.' There is no gang of political scientists arguing against exposure to diverse news sources," stated Sandvig.
"If a study is published in Science, and all three authors work for a pharmaceutical company and it says something positive about that company, we have a way to think about that," Sandvig added.