Facebook: A Political Dining Table

– Sameera S Vasista

It was the year 1686 when Sir Isaac Newton came up with his third law of motion, which roughly states, “For every action, there is an equal and opposite reaction.” It holds even in this age, just with an addition: “For every action, there is an equal and opposite reaction, and social media’s overreaction.” Mr. Mark Zuckerberg presented Facebook, which at its outset served as a platform for friends and family to share what was going on in their lives. As a wise man once said, don’t discuss politics and religion at the dinner table; Zuckerberg seemed to share that verdict for Facebook’s political dining table. But what went wrong as time progressed? How did the social media giant shift its focus from connecting people to spreading political news, which some label fake and many more call biased?


So, how is Facebook filtering politics?

Over the past couple of years, many Facebook news feeds have turned into “Trumpbook,” with a stream of outrage-focused political posts fueling Facebook interactions, especially during election cycles. Facebook first claimed that concerns over its fake news problem were overblown, then repeatedly stated that its algorithm could filter fake news and that it runs a war room where a swarm of contractors analyzes ads and posts for validity. With recent advances in machine learning, it is now possible to algorithmically filter out political news and advertisements using off-the-shelf tools. So, in light of these advances, why doesn’t Facebook offer an option to filter out political posts?
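To make the claim about off-the-shelf tools concrete, here is a minimal sketch, not Facebook’s actual system: a scikit-learn text classifier trained on a handful of invented posts labeled “political” or “not political,” which could then be used to flag or hide new posts. The training examples and labels are made up purely for illustration.

```python
# A toy political-content classifier built from off-the-shelf parts.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training posts for illustration only.
posts = [
    "Vote for our candidate in the upcoming election",
    "The senator's new bill sparks outrage among voters",
    "Congress debates the latest campaign finance reform",
    "Here are photos from our family trip to the lake",
    "My sourdough starter finally worked, recipe inside",
    "Happy birthday to my best friend of twenty years",
]
labels = ["political", "political", "political",
          "not political", "not political", "not political"]

# Off-the-shelf pipeline: TF-IDF features + logistic regression.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(posts, labels)

# Flag an unseen post so a feed could hide or down-rank it.
new_post = "Rally tomorrow to support the governor's re-election campaign"
print(classifier.predict([new_post])[0])  # likely "political"
```

A real system would of course need far more data and careful evaluation, but the point stands: the basic tooling is readily available.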

The first reason is that Facebook has expanded beyond what it was created for: beyond actual friends and conversations, beyond personal and meaningful posts. Feeds have devolved into promotions, fundraising, and fake events, and the platform has become a place to discuss current happenings, politics being one of them, a topic that is sensitive and prone to bias.

In late October, Facebook announced significant changes to its misinformation policies, saying it would stop running political ads in the United States after polls closed on 3rd November for an undetermined period of time. In other words, Facebook stopped accepting new political advertisements one week before 3rd November, and immediately after polls closed it stopped running all political advertisements indefinitely. By doing this, Facebook tried to avoid another political disaster after it was found that Russian operatives had used the platform in 2016 to manipulate the United States elections.

As Zuckerberg mentioned, one common piece of feedback they receive is that people don’t want political content to take over their News Feed. So, they will work to better understand people’s varied preferences for political content and test a number of approaches based on those insights. As a first step, Facebook temporarily reduced the distribution of political content in News Feed for a small percentage of people in Canada, Brazil, Indonesia, and the US. During these initial tests, they explored a variety of ways to rank political content in people’s feeds using different signals, and then decided on the approaches to be used going forward.
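The phrase “reduced distribution” is worth unpacking: political posts are not removed, they are simply scored lower so they surface less often. The following sketch illustrates the general idea under assumptions of my own; the signal names, the demotion factor, and the scoring function are all invented, not Facebook’s actual ranking signals.

```python
# A hedged sketch of down-ranking (not removing) political content.
from dataclasses import dataclass

POLITICAL_DEMOTION = 0.5  # hypothetical factor; the real value is unknown


@dataclass
class Post:
    text: str
    predicted_engagement: float  # e.g. output of a separate engagement model
    is_political: bool           # e.g. output of a classifier like the one above


def feed_score(post: Post) -> float:
    """Score a post for feed ranking, demoting political content."""
    score = post.predicted_engagement
    if post.is_political:
        score *= POLITICAL_DEMOTION  # shown less often, not never
    return score


feed = [
    Post("Election hot take", predicted_engagement=0.9, is_political=True),
    Post("Cousin's wedding photos", predicted_engagement=0.6, is_political=False),
]
ranked = sorted(feed, key=feed_score, reverse=True)
print([p.text for p in ranked])  # the wedding photos now rank first
```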


Depending on how far Facebook’s revamp goes, the reduction in the visibility of political content could disrupt the ecosystem of online activism, publishing, and advertising that has grown up around the social network and its reported 2.8 billion monthly users. It could also reanimate complaints, largely from conservatives, that the company is stifling political speech.

Can Mark Zuckerberg be trusted to take politics out of Facebook? Zuckerberg claims he wants to fix the political polarization Facebook helped bring about, but that may go against his own ideology.

