With the general elections in India due early in 2019, greater attention is being paid to the digital space, which many believe will have a deep impact on the polls. Much of this attention is focused on Facebook, which was heavily criticised for its conduct in the run-up to the 2016 United States presidential elections after it emerged that political consulting firm Cambridge Analytica had accessed the private information of 87 million Facebook users worldwide for targeted political advertising.
Since August, media reports in India have quoted several journalists accusing Facebook of censoring political content. The internet company has been accused of doing this by temporarily suspending accounts, labelling news as “spam”, and not permitting news organisations to promote their articles. But what sort of material invited such action by Facebook, and is the company picking sides as some have alleged?
Facebook started off simply as a way for people to connect. But it quickly grew into a massive public square, with the company having to decide what kinds of speech it would allow on its platform and what would get taken down.
However, many say Facebook’s content moderation – some of which is automated – throws up a never-ending series of challenging questions.
In response to Scroll.in’s emailed queries seeking clarity about this process, Facebook explained that accounts can be flagged for “inauthentic behaviour”, including the repetitive posting of the same content, location mismatches and frequent name changes.
It added:
“Our systems are placed on high alert to prevent coordinated inauthentic behaviour on the platform ahead of elections. This protocol is in place for election-bound countries such as Brazil, US mid-term elections, Mexico, Italy and India. When operating at scale, some pages and accounts are flagged wrongly. However, we have an appeals process in place whereby such wrongly flagged accounts can appeal our decision.”
The company went on to say, “Suppressing content or preventing people from seeing what matters most to them is simply contradictory with Facebook’s mission. In order to maintain an open and safe environment on Facebook, we have global Community Standards that describe what is and is not allowed on our service. These standards do permit open and critical discussion of people who are featured in the news or have a large public audience based on their profession or chosen activities.”
Suspension of accounts
In September, the Facebook accounts of 10 journalists were temporarily suspended, The Telegraph reported last month. The news report said the accounts were locked out of Facebook for posting anti-establishment content.
Among those locked out were Rifat Jawaid and Suresh Kumar of Janta Ka Reporter, a website that focuses on current affairs. They told Scroll.in that the organisation’s account as well as their own accounts were disabled and “banned for 2-3 days” until they emailed identity verification documents to Facebook. Jawaid said his account was suspended minutes after he posted an article critical of the Supreme Court’s decision on September 27 to not revisit its earlier observation in the Ayodhya case that a mosque is not an integral aspect of Islam. However, Facebook’s communication of the account’s suspension did not specify if that post was the reason.
Other journalists named in The Telegraph report cited similar instances of their pages being suspended until they provided identification documents.
Facebook has of late been accused of helping spread fake news and has consequently taken steps to crack down on what it deems suspicious activity. One such step is to ask users to submit their government identification documents to unblock accounts that were suspended for various reasons.
In its statement to Scroll.in, Facebook spoke of the report in The Telegraph:
“The six pages reported by The Telegraph on October 8 were disabled for two reasons. Three for posting repetitive content from different handles. As soon as these accounts appealed, they were guided to have one authentic name and they were reinstated in 24 hours. We are apologetic for removing three pages as we were testing an Election Integrity product and were flagged wrongly. The moment they appealed, they were reinstated in 24 hours.”
There have been cases of accounts with political content being suspended in the past as well. In October last year, Scroll.in reported Facebook’s temporary suspension of a journalist’s account for sharing a photograph of a cash memo with this message at the bottom: “Kamal ka phool hamari bhool.” (It was our mistake to vote for the lotus). The lotus is the election symbol of the Bharatiya Janata Party, which is in power at the Centre, and the post was intended as a dig at the government.
When asked why it had suspended the account, Facebook had told Scroll.in then that the post revealed bank account details, which goes against its privacy standards. It added:
“Suppressing content or preventing people from seeing what matters most to them is contradictory to our mission. Facebook’s Community Standards exist to help keep our community safe and free from abusive behaviour, including fake accounts, hate speech and bullying and harassment. To protect the privacy of our community and prevent fraudulent activity, our policies also prohibit sharing of bank account details. We allow people to use Facebook to challenge ideas and raise awareness about important issues, but we will remove content that violates our Community Standards. We have real people looking at reported content, and it doesn’t matter how many times a piece of content is reported, it will be treated the same. One report is enough to take down content if it violates our policies, and multiple reports will not lead to the removal of content if it meets our Community Standards.”
It is still unclear why the account was suspended for 30 days.
Some political parties have also accused Facebook of suspending their accounts temporarily over criticism of the government. Earlier this month, the Trinamool Congress complained that two of its Facebook pages – Trinamool Community Supporters and Trinamool Congress Community Force, with a combined following of 5,00,000 – and a WhatsApp group were temporarily suspended. It claimed the BJP, threatened by the Trinamool Congress’ growing social media presence, was behind the suspensions. Scroll.in unsuccessfully tried to contact the Trinamool Congress for additional details.
The social media platform has in the past also been accused of taking down material posted in Kashmir, especially if it was about the conflict in the state. Some of this action may stem from content restriction requests by the government, which has rules about the kinds of speech that are permitted in the Valley. A leaked Google document from earlier this year revealed that Facebook had been complying with the authorities’ demands for posts to be taken down.
Marked as ‘spam’
Another problem some newsrooms voiced was Facebook’s marking of news articles as spam. Ajay Prakash, page manager of Janjwar, a Hindi news website, complained that this had happened to some of their articles over the last six months.
“Around August, we posted an article about Atal Bihari Vajpayee’s SPG [Special Protection Group] security head VN Rai which was marked as spam by Facebook,” Prakash told Scroll.in, recalling one instance. He added that he had written to Facebook, asking that it re-evaluate its markings.
The marking of stories on the government as spam has occurred in the past too. In 2015, The Wire reported that Facebook had marked their story on Prime Minister Narendra Modi’s visit to the United Kingdom soon after the 2002 Gujarat riots as spam. The report went on to say that the story was restored after a Facebook spokesperson admitted that it had been “mistakenly captured” by the company’s spam filter.
More recently, the platform was found to be marking as spam stories about a data breach at Facebook itself that had exposed the personal information of 50 million users. In September, several journalists reported being unable to share articles from The Guardian or the Associated Press because the articles were being flagged as spam by Facebook’s filter.
In a tweet, a professor from Syracuse University shared an analysis showing that the Associated Press and The Guardian stories had low traction on Facebook. The professor added that this could not be explained away as a “spam filter” problem and that the company should investigate why it was happening.
Facebook uses a number of tools and filters to flag material as spam while a Community Standards policy guides its community moderation. Earlier this month, Facebook announced that it had removed more than 800 political pages and accounts, all in America, for “spam and coordinated inauthentic behaviour”. But questions have been raised about how carefully the company examined the pages, with some claiming it amounted to censorship.
Boosting request rejected, then accepted
In August, The Caravan, an Indian magazine, published an article claiming that Facebook had declined its request to boost – that is, spend money to promote – its story on BJP national president Amit Shah mortgaging two of his properties to secure a credit facility from a cooperative bank for his son Jay Shah’s firm Kusum Finserve LLP. “This was the first time our request was declined for boosting a post on Facebook,” Surabhi Kanga, web editor at The Caravan, told Scroll.in.
When The Caravan wrote to Facebook, the company replied that the post “doesn’t follow Facebook’s advertising policies”. Facebook also said it had declined the request because it seemed like it was for boosting “housing, employment or credit opportunities, or you’ve included a multicultural affinity segment in your audience”. It added that the magazine would have to certify that it would comply with Facebook’s policies prohibiting discrimination and with anti-discrimination laws. The review process would include any disapproved advertisements from the past three days, the company went on to say.
The Caravan appealed against Facebook’s decision but did not get a response until 10 days later, when the company told the magazine that it had approved the promotion and the boost was permitted.
In the wake of the criticism it faced over the impact of political advertisements on the 2016 United States elections, Facebook has sought to more carefully regulate which posts are approved for boosting. Its initial efforts in this regard have seen many advertisements disallowed despite not necessarily falling foul of the company’s new rules. Quartz also reported that Facebook has been taking down advertisements in India that seem political, even though its precise policies on this are unclear.
Disparate cases, one concern
All these cases involved different situations. Some were about pages and posts disappearing from Facebook or being suspended, often for unclear reasons. Others involved links being marked as spam. And The Caravan case was about links that could be posted but not boosted, potentially limiting their reach. However, all these instances have raised questions about the overall influence of Facebook over the news ecosystem, and its ability to suppress or promote certain kinds of information.
The company maintains an annual transparency report on enforcing its Community Standards, which details how much content has been taken down on government request or in compliance with local laws. This report says Facebook can restrict “content’s availability in the country where it is alleged to be illegal”. According to the latest figures in the report, Facebook restricted 3,142 pieces of Indian content in 2017, while 2,034 such requests were made in 2016.
The journalists who reported content restrictions by Facebook say the social media platform needs to come up with a better way of addressing such complaints. “Facebook should open a helpline for a big market like India, so that people can establish direct communication, as it’s very hard to reach out to Facebook for clarification,” said Ajay Prakash of Janjwar, whose articles had been marked as spam.