The Internet and Mobile Association of India’s Self-Regulation of “Online Curated Content Providers” is an exercise in trying to set up norms for what should and shouldn’t be seen as responsible content online. It is entirely legitimate for a few content providers, chief among them Netflix and STAR India, along with Jio, Arre, Eros Now, Alt Balaji, Zee5, Viacom18 and Sony Pictures Networks, to decide via the IAMAI to collectively censor certain types of content, or to conform to a norm. The real questions are whether this was necessary, and whether the norms they have chosen to conform to are proportionate. A few points:

1. Giving the hecklers the veto: There was no need for this code

While the Ministry of Information and Broadcasting had started a consultation process on regulating online content last year, the process was very quickly put into cold storage and, after Smriti Irani’s departure from the Ministry, passed on to MEITY [the Ministry of Electronics and Information Technology], which doesn’t really have the remit to look into content regulation. Broadly, the government’s concerns have been around the use of WhatsApp to spread misinformation and malicious content; video streaming services aren’t being seen as a concern. So if the video streaming services are not the problem, why are they trying to be a solution? What are they trying to solve with this code? There is no convincing answer to this question.

Online streaming content has been at the forefront of pushing boundaries and starting debates about what is and isn’t acceptable content. Netflix’s Sacred Games, with its uncensored nudity and cuss words, is a case in point, and Netflix, along with STAR India, has been leading the push for signing this code. In India, movies in theatres and on TV are censored excessively: even international films have found, in the past, that scenes have been cut, words and phrases changed in dubbing, and subtitles altered to avoid certain words. Some examples here and here. These platforms are acting out of self-preservation: they don’t want to face the frivolous lawsuits that are the bane of the Indian content industry, but in the process they are also pulling back from pushing at the boundaries of what is considered moral and immoral in India. Few people might recall it, but back in the 1990s the song Sexy, Sexy, Sexy led to protests and the words being changed to Baby, Baby, Baby. The word sexy is no longer problematic. Thousands of people will be offended by something or the other. These platforms seem to be giving the hecklers a veto.

2. If it’s not illegal, will it be censored? Who decides? Where’s the transparency?

There are those who hated AIB’s roast, and there are those who didn’t mind it. What’s important here is that the content was not necessarily illegal, something that is yet to be determined by the courts. The curated content code draws from the much-criticised Section 295A of the IPC, also called the Blasphemy Law, by incorporating its vagueness in prohibiting “Content which deliberately and maliciously intends to outrage religious sentiments of any class, section or community”. In the same way, it incorporates a variation of the vague sedition law, which is currently under scrutiny for its alleged misuse, by prohibiting “Content which deliberately and maliciously promotes or encourages terrorism and other forms of violence against the State (of India) or its institutions”, even though it limits itself to violence and terrorism, instead of “disaffection”.

So what will they do? In order to avoid judicial scrutiny, and thus being acted on by the government, the platforms will set up their own committees to ascertain the legality of content (or act on complaints of blasphemy or sedition). This is also reminiscent of how Section 79 of the IT Act (intermediary liability protections) was being dealt with before the Supreme Court read it down to create a stronger due-diligence process: platforms were getting complaints and, instead of taking on liability, were taking down content. Lawyers will not take the risks that creative people might when it comes to protecting free speech.

The only way these platforms, private though they are, can be expected to protect free speech is if there is transparency for consumers: a listing with details of each complaint filed, the decision made and the action taken. Not surprisingly, the self-regulatory code has no such provisions for transparency. True to form, Hotstar quietly took down the controversial Koffee with Karan episode.

3. The risk of the creation of an online content censorship regime

At this point in time, there is no government norm for regulating online content, an activity that is fraught with risk, since users would push back against censorship of content and speech online. By creating a self-regulatory code, these providers have created a situation where the government and courts can treat this code as standard industry practice and force others to conform to it, even if the content may not be illegal. India is a country where anything you say is bound to offend at least someone, and thus the protection of the right to offend (an integral part of free speech) becomes even more essential. I wouldn’t be surprised if the Ministry of Information and Broadcasting were to take this code as a starting point for creating a regulatory code for all content providers.

The platforms don’t need to do this, even if the choice is entirely theirs to make.

4. The tactical separation

What is notable in the code is that it is an “Online Curated Content Providers” code: a tactical separation of the content industry from the UGC (User Generated Content) space. Historically, internet regulation has not distinguished between video-on-demand providers and user-generated content. On the internet, every consumer is also a creator – a user – and there is no separation between creators and consumers. This was also the premise of the net neutrality debate in India, which STAR India supported. Platforms can be curated-content only, UGC only, or a mix of both. Creators can be professional studios, or a student in a college.

By drawing a distinction between UGC platforms like YouTube, Vimeo, Dailymotion and others, and professional/curated content platforms like Netflix, Hotstar and others, the content industry is creating grounds for differential regulation of intermediaries and platforms.

Perhaps that was the intention all along.

This article first appeared on Medianama.