Mastodon, which positions itself as a better alternative to Twitter, is one of the rare social media platforms to have suspended a law enforcement agency’s account. Recently, it suspended the Assam Police’s handle, saying the platform would “not welcome cops”.
Angered by the suspension of the Twitter account of Supreme Court advocate Sanjay Hegde, Mastodon has seen a trickle of Indian Twitter users joining its platform. Several liberal Twitter users in India had accused the micro-blogging site of censoring anti-government handles and failing to control hate speech, allegations which Twitter India refuted.
Mastodon, which is decentralised and not owned by any one person, was proposed as an alternative to Twitter. It is a free and open-source social networking service where users can host their own servers in the network. The service is similar to its rival in design, with ‘toots’ replacing tweets and ‘boosts’ replacing retweets.
It also has hashtags and a 500-character limit per toot. For a fee, anyone can host their own server, known as an ‘instance’, or users can join an existing instance. These servers are connected as a federated social network, allowing users from different servers to interact with each other.
Mastodon is so focused on the concept of decentralisation that it does not have an official app; users rely instead on apps created by third parties. Instances can choose to remain private or interact with each other.
Several news organisations such as Live Law, News 18, The Quint, as well as BOOM, have joined Mastodon. We spoke to Eugen Rochko from Germany, who founded Mastodon in 2016.
Rochko administers the main instance, Mastodon.social, which is currently home to over 4 lakh users, while other instances are owned and moderated entirely separately from his flagship instance.
We started by asking Rochko the reason behind his instance’s decision to ban the Assam Police and how Mastodon plans to tackle misinformation. Below are edited excerpts from the conversation.
A recent development on Mastodon was that a state police account, belonging to the Assam Police, was suspended.
Yes, our community felt unsafe in the presence of law enforcement. Essentially, it...[had] a chilling effect on people’s speech and it was decided by all of our moderators to do this.
As the number of users increases, how will Mastodon tackle problems like targeted harassment and trolling? What about other instances that are smaller?
It is true that some problems are absent on Mastodon due to its size, relative to other social media platforms. But some problems are not [there] because of the structure...Mastodon is a decentralised platform. We have a higher number of moderators per user and [their] responsibility is shared across the whole network...It also allows multiple communities with different values to coexist...[in] spaces where they can conduct their own rules...Mastodon is fundamentally different from Facebook and Twitter and you won’t see the same problems because of its design.
How many moderators does your instance have?
We have five moderators...Just a few days [ago], we added a new moderator who speaks Hindi...I don’t count myself as a moderator but I help with moderating sometimes. They cover a variety of user-flagged reports in different languages and from different time zones. We’ll have to see what the demand [and] reporting is like [for] other Indian languages like Bengali [or] Tamil...before adding new moderators. On Mastodon.social, we have paid moderators, but...you are not obliged to structure it that way. It is fully acceptable...[in] some servers that are smaller...for moderators to simply be volunteers...[and] help their community.
BOOM spotted a fake account impersonating journalist Rana Ayyub. How will you tackle the problem of fake accounts as there is no verification badge on Mastodon?
I see verification and fake accounts as two different things, because Twitter has verification but that does not stop people from making fake accounts [there]. There are a variety of accounts that impersonate others, parody others and they are not helped by verification...[The way to address this] is getting rid of fake accounts...[which] depends on reporting. A few days ago, we had a person report [to us] that a fake Mastodon account was created in their name and...wanted that account to be transferred to them. And we did [that] after investigating [the report].
We noticed that Inditoot – a Mastodon instance based in India – has a leaner moderation policy. How does this work, as instances can define their own moderation policy?
As a user, one has a few different options – one can mute and block, which can hide individual trolls. The next...[option is to] hide all content from a particular instance...You can also contact your admin [to] take action on behalf of your instance, which is the harshest step, as the admin can cut off access to and from a particular instance. We have received reports about Inditoot, and...seen that the admin has claimed that hate speech will not be moderated. So for that reason, we have, on Mastodon.social, silenced Inditoot from our side...[This] means their content does not appear in people’s notifications.
How will you deal with hate-driven hashtags? Do you have an option where people can report something fake?
The trending hashtag system on Mastodon is developed with this kind of problem in mind. We expect that hashtags can be misused by bots or masses that can...[make something trend, including]...harassment, complaints or misinformation. So before anything trends, it has to be approved. We try to make sure that the hashtags are clean, that they are not ads, and that they are not calling for violence.
Is there a focus on moderating content?
We do not watch all [the] content that goes through the system. We rely on people’s reports, [and] reports from the community help us find stuff that violates the code of conduct. Some of us watch the local timeline – where toots from a local instance appear – where we will notice if something is happening. But it is very helpful when people report stuff, rather than [us] review[ing] hashtags when they are used.
Once something is flagged, such as a fake article, can the platform detect it going forward and warn users?
We do not have a flagging system for links, but that sounds like a good idea. So maybe we’ll have it in the future.
Are direct messages on Mastodon encrypted?
Twitter DMs are not encrypted...Mastodon DMs are encrypted the same way as Twitter’s – in [the] way that they are not. There is [an] encryption between the browser and the website of course, but that’s it. It is not out of the question that Mastodon might implement end-to-end encryption for DMs in the future. However, it is not [a] hundred percent obvious why [a] social media website would need that, because it is not primarily oriented towards a private one-on-one communication system.
This is a lightly edited version of an article that first appeared on BoomLive.