Facebook has conducted a study to find out why most of the posts users see in their News Feed align with their existing beliefs.
The study also set out to determine whether the filter bubble plays a role. A filter bubble arises when an algorithm surfaces posts based on what you have already liked, clicked on, and commented on. The company analyzed anonymized data from 10.1 million Facebook users who disclosed their political affiliation on their profiles. Researchers tracked whether news links were posted or viewed by liberals, conservatives, or moderates.
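To picture how such a feedback loop works, here is a minimal, hypothetical Python sketch; it is not Facebook's actual News Feed code, just one way an engagement-driven ranker ends up showing you more of what you already engage with:

```python
# Hypothetical engagement-based ranking -- NOT Facebook's actual code.
# Posts on topics the user has already liked, clicked, or commented on
# rank higher, which is the feedback loop behind the filter bubble idea.
from collections import Counter

def rank_feed(posts, engagement_history):
    """Order candidate posts by how often the user previously
    engaged with each post's topic."""
    affinity = Counter(engagement_history)  # topic -> interaction count
    return sorted(posts, key=lambda p: affinity[p["topic"]], reverse=True)

# A user who mostly engages with liberal-leaning content
history = ["liberal", "liberal", "liberal", "conservative"]
posts = [
    {"id": 1, "topic": "conservative"},
    {"id": 2, "topic": "liberal"},
    {"id": 3, "topic": "moderate"},
]
print(rank_feed(posts, history))  # the liberal post ranks first,
                                  # reinforcing the bubble
```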
The goal of Facebook’s in-house scientists was to see whether the site’s algorithm has created a filter bubble that leads to political polarization. After scoring news content by whether the organizations that posted it and the people who liked it were liberal or conservative, Facebook found, for example, that Fox News skews slightly conservative while The New York Times skews slightly liberal.
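The alignment idea can be sketched in a few lines: score each source by the average self-reported ideology of the users who share its links. The mechanics below follow that description, but the share counts and scores are invented for illustration:

```python
# Hypothetical sketch of source alignment: average the self-reported
# ideology (-1 = liberal, +1 = conservative) of users who shared a
# source's links. All scores below are invented for illustration.
from statistics import mean

# (source, ideology of one user who shared a link from it)
shares = [
    ("foxnews.com", +0.5), ("foxnews.com", +0.3), ("foxnews.com", +0.1),
    ("nytimes.com", -0.4), ("nytimes.com", -0.2), ("nytimes.com", 0.0),
]

def source_alignment(shares):
    by_source = {}
    for source, ideology in shares:
        by_source.setdefault(source, []).append(ideology)
    return {source: mean(scores) for source, scores in by_source.items()}

print(source_alignment(shares))
# foxnews.com ~ +0.3 (slightly conservative), nytimes.com ~ -0.2
# (slightly liberal), matching the pattern the study describes
```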
The study acknowledges that the bubble effect is real and that it varies with political affiliation; however, the site’s algorithm does not play a major part in it. A user’s friends have a greater impact on the News Feed. According to a blog post accompanying the study, News Feed shows content that matches a user’s own ideology based on his or her Facebook activity. In other words, who your friends are and what you click on shape what appears in your News Feed more than the News Feed ranking does. Since friends tend to share age, geography, educational attainment, and occupation, it is unsurprising that the same holds for political affiliation.
Researchers also examined cases in which conservatives read liberal sources and vice versa, which let them measure how often users view stories they are unlikely to agree with. The findings indicate that the News Feed tends to work as an echo chamber: people were 1% less likely to view stories they disagreed with. That may not sound like much, but some critics consider it significant. Eli Pariser, who coined the term “filter bubble,” remarked:
“Certainly, who your friends are matters a lot in social media. But the fact that the algorithm’s narrowing effect is nearly as strong as our own avoidance of views we disagree with suggests that it’s actually a pretty big deal.”
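Pariser’s comparison can be made concrete with a small sketch that measures how much cross-cutting content (stories opposing a user’s ideology) survives two stages: the ranking algorithm, and the user’s own choice of what to click. Everything below, including the counts, is hypothetical:

```python
# Hypothetical two-stage comparison: how much does each stage reduce a
# user's exposure to cross-cutting stories? All counts are invented.

def cross_cutting_share(stories, user_ideology):
    """Fraction of stories whose alignment opposes the user's ideology."""
    opposing = [s for s in stories if s["alignment"] * user_ideology < 0]
    return len(opposing) / len(stories)

def narrowing(before, after, user_ideology):
    """Relative drop in cross-cutting share between two stages."""
    b = cross_cutting_share(before, user_ideology)
    a = cross_cutting_share(after, user_ideology)
    return (b - a) / b

# friends' shares -> what the algorithm surfaced -> what the user clicked
shared  = [{"alignment": -1}] * 60 + [{"alignment": +1}] * 40
ranked  = [{"alignment": -1}] * 57 + [{"alignment": +1}] * 36
clicked = [{"alignment": -1}] * 30 + [{"alignment": +1}] * 18

user = -1  # a liberal user, so +1 stories are cross-cutting
print(f"algorithm narrowing:      {narrowing(shared, ranked, user):.1%}")
print(f"self-selection narrowing: {narrowing(ranked, clicked, user):.1%}")
```

With these invented counts the two effects come out roughly equal, which is the shape of the result Pariser is pointing at: the algorithm’s narrowing is nearly as strong as the user’s own avoidance.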