Facebook has constituted a new content oversight board that will decide whether specific material should be removed from Facebook or Instagram. The board will focus on content issues including hate speech, harassment and people's safety. It will also have the power to overrule decisions taken by the company and its Chief Executive, Mark Zuckerberg.

Sudhir Krishnaswamy, Vice Chancellor of the National Law School of India University in Bengaluru, is the only Indian who is part of the board. It currently has 20 members, including a former Danish prime minister and a Nobel laureate.

Krishnaswamy is part of the team that will review content enforcement decisions. “Content moderation and content control has been a problem in most jurisdictions around the world, India being one of the most affected,” he told IANS. “Currently, content is controlled either by private companies or the government. If this mechanism works, it provides us with a new institutional model for handling content moderation in the future. This is as important to the future of democracy as it is to the market.”

Krishnaswamy is also the co-founder of the Center for Law and Policy Research, which works to advance constitutional values for everyone, including LGBTQ+ and transgender persons in India, through research, advocacy and impact litigation. He has been the Director of the School of Policy and Governance and Professor of Law and Politics at Azim Premji University, and the Dr BR Ambedkar Visiting Professor of Indian Constitutional Law at Columbia Law School.

The need for such an oversight board arose because Facebook has often been criticised for its high-profile content moderation decisions, Reuters reported. In 2016, Facebook removed the iconic Vietnam War photo featuring a naked nine-year-old girl. After widespread criticism, the company reconsidered its decision and allowed the image to be shared on the platform. The social media giant has also received flak for failing to combat hate speech in Myanmar against the Rohingya and other Muslims.

The board will review whether content is consistent with Facebook and Instagram's policies and values, while committing to uphold freedom of expression within the framework of international human rights norms. "We will make decisions based on these principles, and the impact on users and society, without regard to Facebook's economic, political or reputational interests," the board said in a statement.

The oversight board will have 90 days to make and implement its decisions, though Facebook can ask for a 30-day review in exceptional cases. Based on case decisions, the board can make policy recommendations to Facebook, and the company will have to respond publicly.

Facebook's Head of Global Affairs Nick Clegg said the board would start work immediately and begin hearing cases this summer. The board will eventually grow to 40 members, and Facebook has pledged $130 million (approximately Rs 985 crore) to fund it for at least six years.

Other members

A quarter of the group and two of the four co-chairs are from the United States, where the company’s headquarters is located. The members of the board have lived in 27 countries and speak at least 29 languages.

The four co-chairs, who selected the other members along with Facebook, are former US federal circuit judge and religious freedom expert Michael McConnell, constitutional law expert Jamal Greene, Colombian attorney Catalina Botero-Marino and former Danish Prime Minister Helle Thorning-Schmidt.

The initial members of the group include former European Court of Human Rights judge András Sajó, Internet Sans Frontières executive director Julie Owono, Yemeni activist and Nobel Peace Prize laureate Tawakkol Karman, former editor-in-chief of the Guardian Alan Rusbridger, and Pakistani digital rights advocate Nighat Dad.

“We are not the internet police, don’t think of us as sort of a fast-action group that’s going to swoop in and deal with rapidly moving problems,” McConnell said.

Another board member, internet governance researcher Nicolas Suzor, clarified that the board was not working for Facebook, but trying to pressure the company to improve its policies and processes to better respect human rights. "That's the job. I'm not so naive that I think that that's going to be a very easy job," Suzor said.