The current internet blackout in Kashmir is an egregious use of brute force by the government. It masks another, subtler but important mode of censoring Kashmiri voices. At a time when the Kashmiri media is under siege, the major social media platforms are also removing calls for self-determination.

This means that Indian citizens face threats to their fundamental right to free expression from the Indian government as well as from private social media platforms. The Constitution and the judiciary offer us recourse, at least in theory, when the Indian government violates our rights. But there are currently no avenues through which we can challenge the social media platforms’ interference with these rights.

It is in this context that Facebook’s new Oversight Board, publicised as the company’s effort to protect freedom of expression, is relevant.

What is the Oversight Board?

Facebook has taken a decisive step towards setting up its external Oversight Board by launching its Oversight Board Charter in September. The Oversight Board is a revolutionary attempt by a social media platform to build an adjudicatory mechanism to check its content moderation decisions.

It may be the first time an online intermediary has conducted extensive global consultations, acknowledging its impact across borders. It is certainly the first time that a social media company has considered bringing external, independent actors into its decision-making process.

This is a dramatic, resource-intensive effort by Facebook to create an improved, more legitimate process for its content moderation decisions. In this piece, we highlight the Board’s role in safeguarding freedom of expression online. We then offer our assessment of the limitations of the Board by design and the contexts in which it may have impact.

The Charter declares that the purpose of the Oversight Board is to protect freedom of expression and references the international human rights standard on freedom of expression. It points out that the company is confronted with difficult decisions about content when free expression conflicts with other “values” that Facebook upholds. Some of these values, like dignity and privacy, are widely accepted as reasons for which free expression might be restricted. Others, like “authenticity”, are more ambiguous and potentially questionable, depending on how they are framed.

Facebook is, however, making an effort to safeguard freedom of expression more consistently through the independent Oversight Board, which is meant to limit and clarify the degree to which online speech will be restricted on the platform.

How the Board can help

The Oversight Board will go a long way toward developing a standard for hard cases where Facebook is confronted with conflicting internal values. A classic example is Facebook’s removal of the Napalm Girl photograph for featuring child nudity. After a public furore over the censorship of the iconic Vietnam War photograph, Facebook decided to allow the image on its platform. The Board can help the company make better decisions in cases like this.

There is also great value in the explanation and opportunity for appeal that this Board will offer to Facebook users, although this effort should not replace or remove resources from Facebook’s existing appeals mechanisms. In fact, the oversight system would work best if these are strengthened and financially supported to afford minimum guarantees of due process.

The iconic Napalm Girl photo. Credit: Nick Ut / The Associated Press/Wikimedia Commons

The added value of the Oversight Board is that the platform’s decisions may seem more legitimate as users get insight into why decisions are made a certain way. Facebook does not appear to have systematically involved freedom of expression experts in its decision-making about content moderation in the past. An independent body consisting of such experts is likely to help improve the company’s content decisions, and is an encouraging step.

The limits of the Board

We must, however, be mindful of the limits of the Oversight Board. The “values” are broadly articulated, and unless the Board is given the freedom and the capacity to define them and indicate how they should be balanced against one another, they offer Facebook considerable room to retain some of its more controversial policies.

Additionally, Facebook has reserved the right to appoint the initial members of the Board, and the success or failure of the board will depend on whether they make good choices and draw the meanings and boundaries of values in a way that supports human rights.

Content moderation that Facebook sees as necessary for compliance with local laws will not come within the purview of the Board. For example, if Facebook decides incorrectly that local laws in India require it to silence Kashmiri calls for self-determination, the Board will not be able to review this decision.

This is because the Oversight Board is limited to decisions made by Facebook under its own community standards for content. It cannot do anything about the many serious content moderation problems that result when governments order censorship or when platforms self-censor out of fear of reprisal from governments. The limited scope of the Oversight Board means that we still lack a mechanism to address direct and indirect government restrictions of freedom of expression online.

A larger battle

It seems plausible that this process of independent third-party review created by Facebook will extend to other companies’ content decisions in the future. Facebook has clarified that the Trust that will govern the Oversight Board is structured such that other companies may be able to join later. If other companies like Twitter do elect to join the oversight process, the Board will wield even more influence over online freedom of expression.

In sum, this potentially game-changing step by Facebook is an encouraging beginning to a company’s attempt to improve its accountability to its users. It is helpful to see it as a starting point: an enormously difficult and significant step down a long road towards legitimate, human rights-enabling regulation.

While this may help with some kinds of platform censorship, it will do nothing about the censorship that ensues when foreign platforms over-comply with the Indian government, shrinking the space available for Indian political speech. Kashmiris will gain little if their constitutionally protected protests are treated as illegal speech and their voices remain silenced.

Agustina Del Campo is the Director at the Center for Studies on Freedom of Expression and Access to Information at Universidad de Palermo and an international human rights consultant.

Chinmayi Arun is a fellow of the Information Society Project at Yale Law School and an Affiliate of the Berkman Klein Center at Harvard University.