Facebook Study Says Users, And Not Its Algorithm, Insulate Themselves From Diverse Opinions

While some have suggested that social media networks push certain political agendas, a recent study finds that a user's Facebook News Feed looks the way it does mostly because of the users themselves.

The study examined data from 10.1 million Facebook users to find out how frequently they were exposed to political views different from their own.

"... our work suggests that the power to expose oneself to perspectives from the other side in social media lies first and foremost with individuals," concludes the study.

So, how exactly does this work? Rather than the News Feed arbitrarily selecting which political posts a user sees, the likelihood of encountering political views other than our own depends mostly on our friends and the political views they hold, as well as which links we choose to click on.

What this means is that Facebook's News Feed algorithm doesn't have much effect on the political views users are exposed to. It also doesn't mean that users are never exposed to political views that differ from their own or from those of the friends they keep.

"No matter how you split it, approximately 25 percent of the content that people are consuming is from the other side," said Facebook data scientist Eytan Bakshy, who conducted the study. "In the aggregate, we are exposed to different information."

In conducting the study, the team looked for "cross-cutting content," meaning content that opposes the user's own views. For example, when a liberal sees a political story shared by a conservative friend, that user is being exposed to cross-cutting content even if the story itself was not written from a conservative stance.

Interestingly enough, it seems that liberals are a little more closed off than conservatives on social media. While only 24 percent of hard news content on a liberal user's News Feed is shared by conservatives, 35 percent of hard news content on a conservative user's News Feed is shared by liberals.

Of course, Facebook's algorithm isn't entirely without effect. Due to the News Feed algorithm, liberal users see 8 percent less cross-cutting content than they otherwise would, while conservative users see 5 percent less. That is a rather modest effect compared with what some conspiracy theorists suggest.

Image: Acid Pix | Flickr

ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.