This special compilation demonstrates that addressing the contested space of online extreme speech and internet communication is a layered and complex challenge that requires collaboration from multiple stakeholders.
Below we have articulated a set of priorities that emerged over the course of the discussions and writings. This list is not exhaustive and does not represent every viewpoint, but it seeks to jumpstart a larger and more comprehensive discussion on addressing the challenges facing online content in India today.
Evidence-based policy and action
Policy measures taken at the government and company level need to be grounded in evidence and in the realities of how content is created, disseminated, and consumed in India, and of its impact. Determinations about content need to be grounded in an understanding of social and political context, positionality, and intent.
Identifying impact and harms
Identifying impact and harms from content online is an evolving area that requires further research. How do we identify extreme content? What is the impact of such content on individuals, communities and societies? How do we balance curtailing content with a person’s freedom of expression? What role and responsibility does each stakeholder have in identifying extreme content and harms? Understanding the link, if any, of these harms to company practice and government policy or gaps would be an important step.
Laws and enforcement
Measures and their implementation need to be rights-respecting and in line with the Constitution. Laws and their enforcement need to be contextualised to the distinct nature of the digital medium. The gap between law and enforcement is itself an area that needs to be addressed.
Algorithms and content moderation practices
Companies could take steps to bring more transparency to how their personalisation and content moderation algorithms and practices work. Building capacity for content moderation across multiple Indian languages needs to be a focus area. If training material for moderators is created with public feedback, and if oversight mechanisms for moderation decisions are clear, trust in the system could grow. Creative solutions are needed to address the scalability problem of content moderation.
Code of conduct
The Voluntary Code of Ethics established between companies, Internet and Mobile Association of India, and the Election Commission in India is a positive step towards establishing a co-regulatory framework. This code could be built upon to be more comprehensive and address some of the pressing issues articulated in this compilation.
There is increasing recognition that the business model of social media companies exploits users. Users and regulators in India should explore ways to hold companies accountable for their practices around data and advertising (especially political advertising), for the impact these practices have on the content users are exposed to, and for users' privacy.
As electioneering practices, such as micro-targeting, become more data-centric, it is important that strong safeguards for privacy exist. As a first step, the draft Data Protection Bill, 2018, recognises political and religious affiliations, as well as caste and tribe, as sensitive personal information. This needs to be enacted as part of comprehensive privacy legislation in India.
As fact-checking initiatives in India continue to expand and new ones emerge, accreditation and coordination will be important in ensuring effectiveness of efforts. An industry-wide resource sharing mechanism for fact-checking and reporting that can coordinate what is now a scattered array of reactive efforts is an important first step towards creating a coordinated approach to systematically addressing disinformation in India.
A number of companies including Facebook, Twitter, and Google are providing training to politicians and users. These efforts can be further strengthened through collaboration with civil society, and through transparency to ensure comprehensiveness and independence. All stakeholders should take an active role in digital training.
It is important that the government develops appropriate, proportionate and effective regulation for social media companies that makes these companies accessible and accountable to the government.
Large media vs social media
The journalistic code of ethics should be revisited to enable coordination of guidelines developed by the television industry and guidelines of the Press Council of India and the social media companies. This coordination would be crucial in not only bringing social media content moderators within the scope of media self-regulation but also strengthening existing professional media standards, now largely seen as meek and toothless.
Data access for research
Social media companies should collaborate with the research community by offering access to their data archives for bona fide research on extreme content, along with training modules on using their data, subject to due process of verification and adherence to professional standards.
Alternative narratives, counter speech and freedom of expression
Learnings from the above aspects need to inform a robust discourse around freedom of expression and its contours in the societal and political climate. Critical questions facing India include how to protect against freedom of expression being used as a vehicle for harm while upholding and ensuring it as a fundamental human right. In addition to company and government action, a people-centric perspective should be adopted to encourage grassroots counter-speech initiatives and build community networks that can distribute fact-checking information and fact-checked content. These efforts should connect, coordinate and scale up to build resilience to counter misinformation and harmful extreme speech.
Elonnai Hickok is Chief Operating Officer at the Centre for Internet and Society, India. Sahana Udupa is Professor of Media Anthropology at the University of Munich (LMU), Germany.